[ 587.603196] env[61439]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 588.988450] env[61439]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=61439) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 588.988819] env[61439]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=61439) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 588.988856] env[61439]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=61439) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 588.989177] env[61439]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 589.182131] env[61439]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=61439) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}}
[ 589.191860] env[61439]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.010s {{(pid=61439) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}}
[ 589.299704] env[61439]: INFO nova.virt.driver [None req-5ce3ed96-5ba6-4b01-befc-e10b9584f9c6 None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 589.384634] env[61439]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 589.384874] env[61439]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 589.384985] env[61439]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=61439) __init__ /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:242}}
[ 592.324076] env[61439]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-8f619636-0342-4092-871e-90aeb4642ead {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 592.340908] env[61439]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=61439) _create_session /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:242}}
[ 592.340978] env[61439]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-5cb1caf3-9b73-4612-983e-7ed729d3a0ce {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 592.366026] env[61439]: INFO oslo_vmware.api [-] Successfully established new session; session ID is 395da.
[ 592.366175] env[61439]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 2.981s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 592.366781] env[61439]: INFO nova.virt.vmwareapi.driver [None req-5ce3ed96-5ba6-4b01-befc-e10b9584f9c6 None None] VMware vCenter version: 7.0.3
[ 592.370233] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4635c5ec-ee09-48ae-b042-a3301eb91742 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 592.388209] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a739a3a6-0fb2-4887-aaa4-1865bb992c50 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 592.394489] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc28daa5-98a5-4659-acc2-7d32b02420d2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 592.401152] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f884fc2e-ad63-4db9-ae77-8ac83f769938 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 592.415093] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec8d14ea-0ee5-498f-971d-7da6776a11c9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 592.421396] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-524eb2c0-3c4d-421a-b872-1722e8dffb38 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 592.452292] env[61439]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-172b2acf-a223-475c-a95e-a2124bdbc309 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 592.457484] env[61439]: DEBUG nova.virt.vmwareapi.driver [None req-5ce3ed96-5ba6-4b01-befc-e10b9584f9c6 None None] Extension org.openstack.compute already exists. {{(pid=61439) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:224}}
[ 592.460211] env[61439]: INFO nova.compute.provider_config [None req-5ce3ed96-5ba6-4b01-befc-e10b9584f9c6 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
[ 592.480158] env[61439]: DEBUG nova.context [None req-5ce3ed96-5ba6-4b01-befc-e10b9584f9c6 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),3368e979-e585-49c6-913f-77bff95fda6f(cell1) {{(pid=61439) load_cells /opt/stack/nova/nova/context.py:464}}
[ 592.482169] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 592.482430] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 592.483150] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 592.483850] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] Acquiring lock "3368e979-e585-49c6-913f-77bff95fda6f" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 592.483850] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] Lock "3368e979-e585-49c6-913f-77bff95fda6f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 592.484755] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] Lock "3368e979-e585-49c6-913f-77bff95fda6f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 592.511063] env[61439]: INFO dbcounter [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] Registered counter for database nova_cell0
[ 592.520105] env[61439]: INFO dbcounter [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] Registered counter for database nova_cell1
[ 592.523333] env[61439]: DEBUG oslo_db.sqlalchemy.engines [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=61439) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}}
[ 592.523687] env[61439]: DEBUG oslo_db.sqlalchemy.engines [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=61439) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}}
[ 592.528177] env[61439]: DEBUG dbcounter [-] [61439] Writer thread running {{(pid=61439) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}}
[ 592.529309] env[61439]: DEBUG dbcounter [-] [61439] Writer thread running {{(pid=61439) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}}
[ 592.531112] env[61439]: ERROR nova.db.main.api [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 592.531112] env[61439]:     result = function(*args, **kwargs)
[ 592.531112] env[61439]:   File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 592.531112] env[61439]:     return func(*args, **kwargs)
[ 592.531112] env[61439]:   File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 592.531112] env[61439]:     result = fn(*args, **kwargs)
[ 592.531112] env[61439]:   File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 592.531112] env[61439]:     return f(*args, **kwargs)
[ 592.531112] env[61439]:   File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version
[ 592.531112] env[61439]:     return db.service_get_minimum_version(context, binaries)
[ 592.531112] env[61439]:   File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 592.531112] env[61439]:     _check_db_access()
[ 592.531112] env[61439]:   File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 592.531112] env[61439]:     stacktrace = ''.join(traceback.format_stack())
[ 592.531112] env[61439]: 
[ 592.532206] env[61439]: ERROR nova.db.main.api [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 592.532206] env[61439]:     result = function(*args, **kwargs)
[ 592.532206] env[61439]:   File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 592.532206] env[61439]:     return func(*args, **kwargs)
[ 592.532206] env[61439]:   File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 592.532206] env[61439]:     result = fn(*args, **kwargs)
[ 592.532206] env[61439]:   File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 592.532206] env[61439]:     return f(*args, **kwargs)
[ 592.532206] env[61439]:   File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version
[ 592.532206] env[61439]:     return db.service_get_minimum_version(context, binaries)
[ 592.532206] env[61439]:   File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 592.532206] env[61439]:     _check_db_access()
[ 592.532206] env[61439]:   File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 592.532206] env[61439]:     stacktrace = ''.join(traceback.format_stack())
[ 592.532206] env[61439]: 
[ 592.532555] env[61439]: WARNING nova.objects.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] Failed to get minimum service version for cell 3368e979-e585-49c6-913f-77bff95fda6f
[ 592.533115] env[61439]: WARNING nova.objects.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 592.533211] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] Acquiring lock "singleton_lock" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 592.533361] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] Acquired lock "singleton_lock" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 592.533612] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] Releasing lock "singleton_lock" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 592.533935] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] Full set of CONF: {{(pid=61439) _wait_for_exit_or_signal /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/service.py:362}}
[ 592.534091] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ******************************************************************************** {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2589}}
[ 592.534224] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] Configuration options gathered from: {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2590}}
[ 592.534365] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2591}}
[ 592.534587] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2592}}
[ 592.534733] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ================================================================================ {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2594}}
[ 592.534957] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] allow_resize_to_same_host = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.535146] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] arq_binding_timeout = 300 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.535282] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] backdoor_port = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.535412] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] backdoor_socket = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.535576] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] block_device_allocate_retries = 60 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.535743] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] block_device_allocate_retries_interval = 3 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.535915] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cert = self.pem {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.536094] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.536267] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] compute_monitors = [] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.536439] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] config_dir = [] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.536610] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] config_drive_format = iso9660 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.536749] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.536911] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] config_source = [] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.537095] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] console_host = devstack {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.537270] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] control_exchange = nova {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.537469] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cpu_allocation_ratio = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.537676] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] daemon = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.537882] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] debug = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.538060] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] default_access_ip_network_name = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.538282] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] default_availability_zone = nova {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.538479] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] default_ephemeral_format = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.538656] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] default_green_pool_size = 1000 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.538926] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.539125] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] default_schedule_zone = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.539292] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] disk_allocation_ratio = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.539457] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] enable_new_services = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.539637] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] enabled_apis = ['osapi_compute'] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.539802] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] enabled_ssl_apis = [] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.539962] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] flat_injected = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.540181] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] force_config_drive = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.540295] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] force_raw_images = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.540466] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] graceful_shutdown_timeout = 5 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.540625] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] heal_instance_info_cache_interval = 60 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.540848] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] host = cpu-1 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.541029] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] initial_cpu_allocation_ratio = 4.0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.541199] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] initial_disk_allocation_ratio = 1.0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.541364] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] initial_ram_allocation_ratio = 1.0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.541579] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.541742] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] instance_build_timeout = 0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.541901] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] instance_delete_interval = 300 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.542117] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] instance_format = [instance: %(uuid)s] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.542328] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] instance_name_template = instance-%08x {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.542502] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] instance_usage_audit = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.542674] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] instance_usage_audit_period = month {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.542844] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.543031] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] instances_path = /opt/stack/data/nova/instances {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.543228] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] internal_service_availability_zone = internal {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.543401] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] key = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.543566] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] live_migration_retry_count = 30 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.543733] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] log_config_append = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.543904] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.544078] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] log_dir = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.544246] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] log_file = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.544376] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] log_options = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.544539] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] log_rotate_interval = 1 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.544707] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] log_rotate_interval_type = days {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.544875] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] log_rotation_type = none {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.545016] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.545148] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.545318] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.545488] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.545619] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.545780] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] long_rpc_timeout = 1800 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.545938] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] max_concurrent_builds = 10 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.546113] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] max_concurrent_live_migrations = 1 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.546274] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] max_concurrent_snapshots = 5 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.546433] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] max_local_block_devices = 3 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.546591] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] max_logfile_count = 30 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.546747] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] max_logfile_size_mb = 200 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.546903] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] maximum_instance_delete_attempts = 5 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.547084] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] metadata_listen = 0.0.0.0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.547267] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] metadata_listen_port = 8775 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.547453] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] metadata_workers = 2 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.547617] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] migrate_max_retries = -1 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.547784] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] mkisofs_cmd = genisoimage {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.547995] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] my_block_storage_ip = 10.180.1.21 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.548144] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] my_ip = 10.180.1.21 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.548310] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] network_allocate_retries = 0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.548493] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.548660] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] osapi_compute_listen = 0.0.0.0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.548824] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] osapi_compute_listen_port = 8774 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.548989] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] osapi_compute_unique_server_name_scope = {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.549248] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] osapi_compute_workers = 2 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 592.549335] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d
None None] password_length = 12 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.549496] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] periodic_enable = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.549660] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] periodic_fuzzy_delay = 60 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.549825] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] pointer_model = usbtablet {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.549989] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] preallocate_images = none {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.550161] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] publish_errors = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.550368] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] pybasedir = /opt/stack/nova {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.550452] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ram_allocation_ratio = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.550598] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] rate_limit_burst = 0 {{(pid=61439) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.550765] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] rate_limit_except_level = CRITICAL {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.550924] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] rate_limit_interval = 0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.551094] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] reboot_timeout = 0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.551258] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] reclaim_instance_interval = 0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.551418] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] record = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.551585] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] reimage_timeout_per_gb = 60 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.551751] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] report_interval = 120 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.551913] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] rescue_timeout = 0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 
592.552083] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] reserved_host_cpus = 0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.552267] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] reserved_host_disk_mb = 0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.552434] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] reserved_host_memory_mb = 512 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.552649] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] reserved_huge_pages = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.552828] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] resize_confirm_window = 0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.552992] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] resize_fs_using_block_device = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.553176] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] resume_guests_state_on_host_boot = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.553388] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.553603] 
env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] rpc_response_timeout = 60 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.553792] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] run_external_periodic_tasks = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.553970] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] running_deleted_instance_action = reap {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.554150] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] running_deleted_instance_poll_interval = 1800 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.554314] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] running_deleted_instance_timeout = 0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.554477] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] scheduler_instance_sync_interval = 120 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.554648] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] service_down_time = 720 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.554820] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] servicegroup_driver = db {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.554981] 
env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] shelved_offload_time = 0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.555159] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] shelved_poll_interval = 3600 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.555353] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] shutdown_timeout = 0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.555532] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] source_is_ipv6 = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.555694] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ssl_only = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.555957] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.556142] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] sync_power_state_interval = 600 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.556308] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] sync_power_state_pool_size = 1000 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.556479] env[61439]: DEBUG oslo_service.service [None 
req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] syslog_log_facility = LOG_USER {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.556637] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] tempdir = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.556796] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] timeout_nbd = 10 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.556964] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] transport_url = **** {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.557141] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] update_resources_interval = 0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.557597] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] use_cow_images = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.557597] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] use_eventlog = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.557597] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] use_journal = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.557750] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] use_json = False {{(pid=61439) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.557911] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] use_rootwrap_daemon = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.558085] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] use_stderr = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.558249] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] use_syslog = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.558432] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vcpu_pin_set = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.558610] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vif_plugging_is_fatal = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.558780] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vif_plugging_timeout = 300 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.558949] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] virt_mkfs = [] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.559129] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] volume_usage_poll_interval = 0 {{(pid=61439) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.559386] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] watch_log_file = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.559467] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] web = /usr/share/spice-html5 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 592.559653] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_concurrency.disable_process_locking = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.559970] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.560171] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.560438] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.560629] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_metrics.metrics_process_name = {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.560799] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] 
oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.561026] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.561226] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api.auth_strategy = keystone {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.561430] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api.compute_link_prefix = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.561622] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.561841] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api.dhcp_domain = novalocal {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.562078] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api.enable_instance_password = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.562290] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api.glance_link_prefix = None {{(pid=61439) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.562472] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.562649] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api.instance_list_cells_batch_strategy = distributed {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.562812] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api.instance_list_per_project_cells = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.562976] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api.list_records_by_skipping_down_cells = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.563157] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api.local_metadata_per_cell = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.563328] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api.max_limit = 1000 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.563493] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api.metadata_cache_expiration = 15 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.563665] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] 
api.neutron_default_tenant_id = default {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.563829] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api.use_forwarded_for = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.563994] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api.use_neutron_default_nets = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.564177] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.564343] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api.vendordata_dynamic_failure_fatal = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.564559] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.564750] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api.vendordata_dynamic_ssl_certfile = {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.564930] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api.vendordata_dynamic_targets = [] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.565109] env[61439]: DEBUG oslo_service.service [None 
req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api.vendordata_jsonfile_path = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.565291] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api.vendordata_providers = ['StaticJSON'] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.565488] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.backend = dogpile.cache.memcached {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.565658] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.backend_argument = **** {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.565832] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.config_prefix = cache.oslo {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.566015] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.dead_timeout = 60.0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.566191] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.debug_cache_backend = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.566363] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.enable_retry_client = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.566524] env[61439]: DEBUG oslo_service.service 
[None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.enable_socket_keepalive = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.566695] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.enabled = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.566861] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.expiration_time = 600 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.567051] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.hashclient_retry_attempts = 2 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.567205] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.hashclient_retry_delay = 1.0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.567375] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.memcache_dead_retry = 300 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.567582] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.memcache_password = {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.567755] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.567919] env[61439]: DEBUG oslo_service.service [None 
req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.568095] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.memcache_pool_maxsize = 10 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.568262] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.memcache_pool_unused_timeout = 60 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.568427] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.memcache_sasl_enabled = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.568607] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.memcache_servers = ['localhost:11211'] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.568774] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.memcache_socket_timeout = 1.0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.568946] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.memcache_username = {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.569127] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.proxies = [] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.569295] env[61439]: DEBUG 
oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.retry_attempts = 2 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.569527] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.retry_delay = 0.0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.569629] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.socket_keepalive_count = 1 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.569789] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.socket_keepalive_idle = 1 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.569951] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.socket_keepalive_interval = 1 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.570125] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.tls_allowed_ciphers = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.570287] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.tls_cafile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.570462] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.tls_certfile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.570653] env[61439]: DEBUG oslo_service.service [None 
req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.tls_enabled = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.570805] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cache.tls_keyfile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.570976] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cinder.auth_section = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.571171] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cinder.auth_type = password {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.571339] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cinder.cafile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.571522] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cinder.catalog_info = volumev3::publicURL {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.571686] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cinder.certfile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.571850] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cinder.collect_timing = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.572026] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None 
None] cinder.cross_az_attach = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.572242] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cinder.debug = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.572395] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cinder.endpoint_template = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.572565] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cinder.http_retries = 3 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.572733] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cinder.insecure = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.572897] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cinder.keyfile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.573084] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cinder.os_region_name = RegionOne {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.573280] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cinder.split_loggers = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.573456] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cinder.timeout = None {{(pid=61439) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.573662] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.573789] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] compute.cpu_dedicated_set = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.573951] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] compute.cpu_shared_set = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.574164] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] compute.image_type_exclude_list = [] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.574302] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] compute.live_migration_wait_for_vif_plug = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.574469] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] compute.max_concurrent_disk_ops = 0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.574634] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] compute.max_disk_devices_to_attach = -1 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.574796] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] 
compute.packing_host_numa_cells_allocation_strategy = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.575045] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.575148] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] compute.resource_provider_association_refresh = 300 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.575316] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] compute.shutdown_retry_interval = 10 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.575500] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.575682] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] conductor.workers = 2 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.575859] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] console.allowed_origins = [] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.576047] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] console.ssl_ciphers = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.576252] 
env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] console.ssl_minimum_version = default {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.576463] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] consoleauth.token_ttl = 600 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.576641] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cyborg.cafile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.576808] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cyborg.certfile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.576976] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cyborg.collect_timing = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.577377] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cyborg.connect_retries = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.577562] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cyborg.connect_retry_delay = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.577730] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cyborg.endpoint_override = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.577897] env[61439]: DEBUG 
oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cyborg.insecure = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.578076] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cyborg.keyfile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.578261] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cyborg.max_version = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.578487] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cyborg.min_version = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.578565] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cyborg.region_name = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.578722] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cyborg.service_name = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.578894] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cyborg.service_type = accelerator {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.579069] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cyborg.split_loggers = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.579836] env[61439]: DEBUG oslo_service.service [None 
req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cyborg.status_code_retries = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.579836] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cyborg.status_code_retry_delay = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.579836] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cyborg.timeout = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.579836] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.579960] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] cyborg.version = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.580139] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] database.backend = sqlalchemy {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.580329] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] database.connection = **** {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.580509] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] database.connection_debug = 0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.580749] env[61439]: DEBUG oslo_service.service [None 
req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] database.connection_parameters = {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.580950] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] database.connection_recycle_time = 3600 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.581101] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] database.connection_trace = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.581291] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] database.db_inc_retry_interval = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.581463] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] database.db_max_retries = 20 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.581629] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] database.db_max_retry_interval = 10 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.581793] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] database.db_retry_interval = 1 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.581965] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] database.max_overflow = 50 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.582147] env[61439]: DEBUG oslo_service.service [None 
req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] database.max_pool_size = 5 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.582378] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] database.max_retries = 10 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.582577] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] database.mysql_sql_mode = TRADITIONAL {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.582744] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] database.mysql_wsrep_sync_wait = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.582911] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] database.pool_timeout = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.583096] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] database.retry_interval = 10 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.583267] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] database.slave_connection = **** {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.583438] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] database.sqlite_synchronous = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.583602] env[61439]: DEBUG oslo_service.service [None 
req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] database.use_db_reconnect = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.583787] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api_database.backend = sqlalchemy {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.583967] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api_database.connection = **** {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.584154] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api_database.connection_debug = 0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.584331] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api_database.connection_parameters = {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.584498] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api_database.connection_recycle_time = 3600 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.584666] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api_database.connection_trace = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.584828] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api_database.db_inc_retry_interval = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.584994] env[61439]: DEBUG 
oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api_database.db_max_retries = 20 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.585172] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api_database.db_max_retry_interval = 10 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.585358] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api_database.db_retry_interval = 1 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.585555] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api_database.max_overflow = 50 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.585724] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api_database.max_pool_size = 5 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.585895] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api_database.max_retries = 10 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.586080] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.586249] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api_database.mysql_wsrep_sync_wait = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.586420] 
env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api_database.pool_timeout = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.586592] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api_database.retry_interval = 10 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.586752] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api_database.slave_connection = **** {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.586919] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] api_database.sqlite_synchronous = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.587109] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] devices.enabled_mdev_types = [] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.587295] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.587465] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ephemeral_storage_encryption.enabled = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.587636] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ephemeral_storage_encryption.key_size = 512 {{(pid=61439) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.587810] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.api_servers = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.587976] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.cafile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.588157] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.certfile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.588322] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.collect_timing = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.588511] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.connect_retries = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.588681] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.connect_retry_delay = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.588847] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.debug = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.589026] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.default_trusted_certificate_ids = [] {{(pid=61439) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.589199] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.enable_certificate_validation = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.589368] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.enable_rbd_download = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.589528] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.endpoint_override = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.589698] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.insecure = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.589864] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.keyfile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.590034] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.max_version = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.590199] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.min_version = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.590368] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.num_retries = 3 {{(pid=61439) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.590540] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.rbd_ceph_conf = {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.590708] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.rbd_connect_timeout = 5 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.590897] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.rbd_pool = {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.591060] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.rbd_user = {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.591221] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.region_name = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.591398] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.service_name = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.591588] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.service_type = image {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.591756] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.split_loggers = False {{(pid=61439) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.591918] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.status_code_retries = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.592091] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.status_code_retry_delay = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.592281] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.timeout = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.592474] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.592646] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.verify_glance_signatures = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.592810] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] glance.version = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.593067] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] guestfs.debug = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.593262] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] hyperv.config_drive_cdrom = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.593433] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] hyperv.config_drive_inject_password = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.593605] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.593775] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] hyperv.enable_instance_metrics_collection = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.593941] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] hyperv.enable_remotefx = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.594127] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] hyperv.instances_path_share = {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.594301] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] hyperv.iscsi_initiator_list = [] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.594491] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] hyperv.limit_cpu_features = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.594672] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.594838] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.595008] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] hyperv.power_state_check_timeframe = 60 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.595190] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] hyperv.power_state_event_polling_interval = 2 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.595371] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.595536] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] hyperv.use_multipath_io = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.595700] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] hyperv.volume_attach_retry_count = 10 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.595863] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] hyperv.volume_attach_retry_interval = 5 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.596109] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] hyperv.vswitch_name = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.596284] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.596461] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] mks.enabled = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.596837] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.597044] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] image_cache.manager_interval = 2400 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.597594] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] image_cache.precache_concurrency = 1 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.597594] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] image_cache.remove_unused_base_images = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.597594] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.597778] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.597954] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] image_cache.subdirectory_name = _base {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.598154] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ironic.api_max_retries = 60 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.598326] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ironic.api_retry_interval = 2 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.598492] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ironic.auth_section = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.598659] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ironic.auth_type = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.598827] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ironic.cafile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.598992] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ironic.certfile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.599178] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ironic.collect_timing = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.599349] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ironic.conductor_group = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.599512] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ironic.connect_retries = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.599675] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ironic.connect_retry_delay = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.599839] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ironic.endpoint_override = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.600014] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ironic.insecure = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.600179] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ironic.keyfile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.600343] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ironic.max_version = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.600525] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ironic.min_version = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.600709] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ironic.peer_list = [] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.600878] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ironic.region_name = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.601069] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ironic.serial_console_state_timeout = 10 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.601219] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ironic.service_name = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.601390] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ironic.service_type = baremetal {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.601551] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ironic.split_loggers = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.601712] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ironic.status_code_retries = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.601872] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ironic.status_code_retry_delay = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.602044] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ironic.timeout = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.602266] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.602446] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] ironic.version = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.602640] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.602821] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] key_manager.fixed_key = **** {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.603021] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.603215] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] barbican.barbican_api_version = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.603400] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] barbican.barbican_endpoint = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.603582] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] barbican.barbican_endpoint_type = public {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.603744] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] barbican.barbican_region_name = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.603906] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] barbican.cafile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.604081] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] barbican.certfile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.604251] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] barbican.collect_timing = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.604447] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] barbican.insecure = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.604615] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] barbican.keyfile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.604780] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] barbican.number_of_retries = 60 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.604943] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] barbican.retry_delay = 1 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.605127] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] barbican.send_service_user_token = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.605297] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] barbican.split_loggers = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.605464] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] barbican.timeout = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.605625] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] barbican.verify_ssl = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.605785] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] barbican.verify_ssl_path = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.605954] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] barbican_service_user.auth_section = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.606134] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] barbican_service_user.auth_type = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.606323] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] barbican_service_user.cafile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.606501] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] barbican_service_user.certfile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.606672] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] barbican_service_user.collect_timing = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.606839] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] barbican_service_user.insecure = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.607007] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] barbican_service_user.keyfile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.607182] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] barbican_service_user.split_loggers = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.607345] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] barbican_service_user.timeout = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.607518] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vault.approle_role_id = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.607682] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vault.approle_secret_id = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.607843] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vault.cafile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.608008] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vault.certfile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.608185] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vault.collect_timing = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.608353] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vault.insecure = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.608514] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vault.keyfile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.608687] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vault.kv_mountpoint = secret {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.608848] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vault.kv_path = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.609022] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vault.kv_version = 2 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.609187] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vault.namespace = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.609377] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vault.root_token_id = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.609555] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vault.split_loggers = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.609718] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vault.ssl_ca_crt_file = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.609881] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vault.timeout = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.610059] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vault.use_ssl = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.610301] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.610464] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] keystone.auth_section = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.610636] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] keystone.auth_type = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.610800] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] keystone.cafile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.610961] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] keystone.certfile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.611229] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] keystone.collect_timing = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.611307] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] keystone.connect_retries = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.611455] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] keystone.connect_retry_delay = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.611633] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] keystone.endpoint_override = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.611772] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] keystone.insecure = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.611934] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] keystone.keyfile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.612105] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] keystone.max_version = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.612292] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] keystone.min_version = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.612463] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] keystone.region_name = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.612623] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] keystone.service_name = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.612795] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] keystone.service_type = identity {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.612961] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] keystone.split_loggers = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.613136] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] keystone.status_code_retries = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.613302] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] keystone.status_code_retry_delay = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.613463] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] keystone.timeout = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.613647] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.613810] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] keystone.version = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.614021] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.connection_uri = {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.614191] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.cpu_mode = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.614710] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.cpu_model_extra_flags = [] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.614710] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.cpu_models = [] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.614710] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.cpu_power_governor_high = performance {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.614883] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.cpu_power_governor_low = powersave {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.615030] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.cpu_power_management = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.615207] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.615379] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.device_detach_attempts = 8 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.615577] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.device_detach_timeout = 20 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.615705] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.disk_cachemodes = [] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.615868] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.disk_prefix = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.616048] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.enabled_perf_events = [] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.616221] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.file_backed_memory = 0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.616412] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.gid_maps = [] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.616587] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.hw_disk_discard = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.616748] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.hw_machine_type = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.616921] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.images_rbd_ceph_conf = {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.617100] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.617282] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.617460] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.images_rbd_glance_store_name = {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.617636] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.images_rbd_pool = rbd {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.617809] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.images_type = default {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.617974] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.images_volume_group = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.618153] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.inject_key = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.618319] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.inject_partition = -2 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.618484] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.inject_password = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.618651] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.iscsi_iface = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.618816] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.iser_use_multipath = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.618982] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.live_migration_bandwidth = 0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.619163] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.live_migration_completion_timeout = 800 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.619329] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.live_migration_downtime = 500 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.619495] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.live_migration_downtime_delay = 75 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.619665] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.live_migration_downtime_steps = 10 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.619832] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.live_migration_inbound_addr = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.620015] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.live_migration_permit_auto_converge = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.620196] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.live_migration_permit_post_copy = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.620453] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.live_migration_scheme = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.620559] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.live_migration_timeout_action = abort {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.620726] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.live_migration_tunnelled = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.620890] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.live_migration_uri = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.621069] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.live_migration_with_native_tls = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.621234] env[61439]: DEBUG
oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.max_queues = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.621401] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.mem_stats_period_seconds = 10 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.621578] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.nfs_mount_options = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.621886] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.622075] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.num_aoe_discover_tries = 3 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.622278] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.num_iser_scan_tries = 5 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.622452] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.num_memory_encrypted_guests = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.622622] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.num_nvme_discover_tries = 5 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
592.622789] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.num_pcie_ports = 0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.622958] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.num_volume_scan_tries = 5 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.623148] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.pmem_namespaces = [] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.623339] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.quobyte_client_cfg = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.623647] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.623823] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.rbd_connect_timeout = 5 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.623995] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.624179] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=61439) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.624370] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.rbd_secret_uuid = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.624601] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.rbd_user = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.624779] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.realtime_scheduler_priority = 1 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.624956] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.remote_filesystem_transport = ssh {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.625137] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.rescue_image_id = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.625303] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.rescue_kernel_id = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.625471] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.rescue_ramdisk_id = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.625709] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.rng_dev_path = /dev/urandom {{(pid=61439) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.625802] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.rx_queue_size = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.625971] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.smbfs_mount_options = {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.626270] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.626450] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.snapshot_compression = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.626616] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.snapshot_image_format = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.626840] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.627019] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.sparse_logical_volumes = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.627191] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None 
None] libvirt.swtpm_enabled = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.627369] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.swtpm_group = tss {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.627539] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.swtpm_user = tss {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.627712] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.sysinfo_serial = unique {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.627874] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.tb_cache_size = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.628047] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.tx_queue_size = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.628221] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.uid_maps = [] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.628413] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.use_virtio_for_bridges = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.628606] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.virt_type = kvm 
{{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.628781] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.volume_clear = zero {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.628947] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.volume_clear_size = 0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.629134] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.volume_use_multipath = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.629303] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.vzstorage_cache_path = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.629477] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.629650] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.vzstorage_mount_group = qemu {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.629816] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.vzstorage_mount_opts = [] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.629987] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None 
None] libvirt.vzstorage_mount_perms = 0770 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.630278] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.630460] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.vzstorage_mount_user = stack {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.630631] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.630809] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] neutron.auth_section = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.630985] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] neutron.auth_type = password {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.631166] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] neutron.cafile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.631333] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] neutron.certfile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.631501] env[61439]: DEBUG oslo_service.service [None 
req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] neutron.collect_timing = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.631698] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] neutron.connect_retries = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.631895] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] neutron.connect_retry_delay = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.632066] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] neutron.default_floating_pool = public {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.632171] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] neutron.endpoint_override = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.632354] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] neutron.extension_sync_interval = 600 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.632525] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] neutron.http_retries = 3 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.632688] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] neutron.insecure = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.632851] env[61439]: DEBUG oslo_service.service [None 
req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] neutron.keyfile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.633022] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] neutron.max_version = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.633199] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] neutron.metadata_proxy_shared_secret = **** {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.633365] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] neutron.min_version = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.633539] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] neutron.ovs_bridge = br-int {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.633706] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] neutron.physnets = [] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.633881] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] neutron.region_name = RegionOne {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.634065] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] neutron.service_metadata_proxy = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.634236] env[61439]: DEBUG oslo_service.service [None 
req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] neutron.service_name = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.634410] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] neutron.service_type = network {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.634578] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] neutron.split_loggers = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.634739] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] neutron.status_code_retries = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.634901] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] neutron.status_code_retry_delay = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.635073] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] neutron.timeout = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.635262] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.635427] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] neutron.version = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.635602] env[61439]: DEBUG oslo_service.service [None 
req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] notifications.bdms_in_notifications = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.635832] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] notifications.default_level = INFO {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.635954] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] notifications.notification_format = unversioned {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.636135] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] notifications.notify_on_state_change = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.636316] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.636499] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] pci.alias = [] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.636812] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] pci.device_spec = [] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.636881] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] pci.report_in_placement = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.637022] 
env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.auth_section = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.637198] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.auth_type = password {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.637372] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.auth_url = http://10.180.1.21/identity {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.637538] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.cafile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.637699] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.certfile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.637863] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.collect_timing = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.638029] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.connect_retries = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.638197] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.connect_retry_delay = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.638359] 
env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.default_domain_id = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.638521] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.default_domain_name = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.638682] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.domain_id = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.638840] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.domain_name = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.639018] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.endpoint_override = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.639181] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.insecure = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.639366] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.keyfile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.639526] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.max_version = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.639660] env[61439]: DEBUG 
oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.min_version = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.639830] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.password = **** {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.639993] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.project_domain_id = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.640188] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.project_domain_name = Default {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.640391] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.project_id = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.640576] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.project_name = service {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.640750] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.region_name = RegionOne {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.640911] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.service_name = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.641095] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.service_type = placement {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.641265] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.split_loggers = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.641428] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.status_code_retries = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.641589] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.status_code_retry_delay = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.641818] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.system_scope = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.641904] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.timeout = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.642081] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.trust_id = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.642261] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.user_domain_id = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.642446] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.user_domain_name = Default {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.642610] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.user_id = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.642787] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.username = placement {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.642974] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.643159] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] placement.version = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.643364] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] quota.cores = 20 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.643538] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] quota.count_usage_from_placement = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.643714] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.643887] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] quota.injected_file_content_bytes = 10240 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.644069] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] quota.injected_file_path_length = 255 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.644242] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] quota.injected_files = 5 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.644412] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] quota.instances = 10 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.644582] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] quota.key_pairs = 100 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.644752] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] quota.metadata_items = 128 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.644921] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] quota.ram = 51200 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.645099] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] quota.recheck_quota = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.645273] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] quota.server_group_members = 10 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.645444] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] quota.server_groups = 10 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.645617] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] rdp.enabled = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.645949] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.646140] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.646314] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.646484] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] scheduler.image_metadata_prefilter = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.646650] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.646815] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] scheduler.max_attempts = 3 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.646980] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] scheduler.max_placement_results = 1000 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.647161] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.647335] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] scheduler.query_placement_for_image_type_support = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.647513] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.647690] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] scheduler.workers = 2 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.647866] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.648051] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.648239] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.648415] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.648595] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.648767] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.648936] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.649142] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.649340] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] filter_scheduler.host_subset_size = 1 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.649530] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.649698] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] filter_scheduler.image_properties_default_architecture = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.649866] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.650049] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] filter_scheduler.isolated_hosts = [] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.650225] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] filter_scheduler.isolated_images = [] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.650396] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] filter_scheduler.max_instances_per_host = 50 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.650562] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.650727] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] filter_scheduler.num_instances_weight_multiplier = 0.0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.650895] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] filter_scheduler.pci_in_placement = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.651070] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.651238] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.651412] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.651578] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.651744] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.651932] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.652080] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] filter_scheduler.track_instance_changes = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.652284] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.652469] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] metrics.required = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.652636] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] metrics.weight_multiplier = 1.0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.652803] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] metrics.weight_of_unavailable = -10000.0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.653029] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] metrics.weight_setting = [] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.653354] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.653536] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] serial_console.enabled = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.653720] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] serial_console.port_range = 10000:20000 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.653893] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.654077] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.654251] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] serial_console.serialproxy_port = 6083 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.654426] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] service_user.auth_section = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.654603] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] service_user.auth_type = password {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.654768] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] service_user.cafile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.654929] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] service_user.certfile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.655107] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] service_user.collect_timing = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.655275] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] service_user.insecure = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.655439] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] service_user.keyfile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.655629] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] service_user.send_service_user_token = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.655796] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] service_user.split_loggers = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.655957] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] service_user.timeout = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.656146] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] spice.agent_enabled = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.656310] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] spice.enabled = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.656610] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.656809] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] spice.html5proxy_host = 0.0.0.0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.656982] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] spice.html5proxy_port = 6082 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.657162] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] spice.image_compression = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.657325] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] spice.jpeg_compression = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.657489] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] spice.playback_compression = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.657662] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] spice.server_listen = 127.0.0.1 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.657832] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.657996] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] spice.streaming_mode = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.658173] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] spice.zlib_compression = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.658346] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] upgrade_levels.baseapi = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.658509] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] upgrade_levels.cert = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.658683] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] upgrade_levels.compute = auto {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.658844] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] upgrade_levels.conductor = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.659010] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] upgrade_levels.scheduler = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.659187] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vendordata_dynamic_auth.auth_section = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.659353] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vendordata_dynamic_auth.auth_type = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.659514] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vendordata_dynamic_auth.cafile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.659676] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vendordata_dynamic_auth.certfile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.659842] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vendordata_dynamic_auth.collect_timing = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.660011] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vendordata_dynamic_auth.insecure = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.660181] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vendordata_dynamic_auth.keyfile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.660347] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vendordata_dynamic_auth.split_loggers = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.660507] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vendordata_dynamic_auth.timeout = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.660682] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vmware.api_retry_count = 10 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.660846] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vmware.ca_file = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.661029] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vmware.cache_prefix = devstack-image-cache {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.661207] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vmware.cluster_name = testcl1 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.661401] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vmware.connection_pool_size = 10 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.661581] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vmware.console_delay_seconds = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.661757] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vmware.datastore_regex = ^datastore.* {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.661977] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.662165] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vmware.host_password = **** {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.662355] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vmware.host_port = 443 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.662588] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vmware.host_username = administrator@vsphere.local {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.662702] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vmware.insecure = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.662866] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vmware.integration_bridge = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.663045] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vmware.maximum_objects = 100 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.663216] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vmware.pbm_default_policy = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.663382] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vmware.pbm_enabled = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.663542] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vmware.pbm_wsdl_location = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.663713] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.663872] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vmware.serial_port_proxy_uri = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.664042] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vmware.serial_port_service_uri = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.664217] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vmware.task_poll_interval = 0.5 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.664394] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vmware.use_linked_clone = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.664565] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vmware.vnc_keymap = en-us {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.664734] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vmware.vnc_port = 5900 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.664920] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vmware.vnc_port_total = 10000 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.665156] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vnc.auth_schemes = ['none'] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.665346] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vnc.enabled = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.665639] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.665827] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.666015] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vnc.novncproxy_port = 6080 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.666202] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vnc.server_listen = 127.0.0.1 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.666381] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.666548] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vnc.vencrypt_ca_certs = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.666714] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vnc.vencrypt_client_cert = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.666876] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vnc.vencrypt_client_key = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.667073] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.667245] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] workarounds.disable_deep_image_inspection = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.667415] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] workarounds.disable_fallback_pcpu_query = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.667581] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] workarounds.disable_group_policy_check_upcall = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.667745] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.667911] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] workarounds.disable_rootwrap = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.668086] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] workarounds.enable_numa_live_migration = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.668255] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.668422] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.668585] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] workarounds.handle_virt_lifecycle_events = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 592.668748] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] workarounds.libvirt_disable_apic = False {{(pid=61439) log_opt_values
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.668908] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] workarounds.never_download_image_if_on_rbd = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.669087] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.669257] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.669422] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.669586] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.669748] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.669907] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.670087] env[61439]: DEBUG oslo_service.service 
[None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.670253] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.670423] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.670607] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.670778] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] wsgi.client_socket_timeout = 900 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.670947] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] wsgi.default_pool_size = 1000 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.671131] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] wsgi.keep_alive = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.671304] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] wsgi.max_header_line = 16384 {{(pid=61439) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.671471] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] wsgi.secure_proxy_ssl_header = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.671635] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] wsgi.ssl_ca_file = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.671798] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] wsgi.ssl_cert_file = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.671962] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] wsgi.ssl_key_file = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.672187] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] wsgi.tcp_keepidle = 600 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.672359] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.672531] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] zvm.ca_file = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.672705] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] 
zvm.cloud_connector_url = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.672989] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.673192] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] zvm.reachable_timeout = 300 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.673405] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_policy.enforce_new_defaults = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.673590] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_policy.enforce_scope = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.673774] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_policy.policy_default_rule = default {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.673960] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.674178] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_policy.policy_file = policy.yaml {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.674329] env[61439]: DEBUG oslo_service.service [None 
req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.674497] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.674661] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.674822] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.674987] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.675176] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.675357] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.675535] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] profiler.connection_string = messaging:// {{(pid=61439) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.675705] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] profiler.enabled = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.675875] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] profiler.es_doc_type = notification {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.676053] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] profiler.es_scroll_size = 10000 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.676258] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] profiler.es_scroll_time = 2m {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.676391] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] profiler.filter_error_trace = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.676564] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] profiler.hmac_keys = **** {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.676741] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] profiler.sentinel_service_name = mymaster {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.676908] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] profiler.socket_timeout = 0.1 {{(pid=61439) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.677086] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] profiler.trace_requests = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.677254] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] profiler.trace_sqlalchemy = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.677442] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] profiler_jaeger.process_tags = {} {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.677606] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] profiler_jaeger.service_name_prefix = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.677774] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] profiler_otlp.service_name_prefix = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.677943] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] remote_debug.host = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.678118] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] remote_debug.port = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.678305] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=61439) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.678473] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.678639] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.678805] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.678968] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.679146] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.679315] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.679482] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.679648] env[61439]: DEBUG oslo_service.service [None 
req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.679810] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.679983] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.680167] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.680341] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.680510] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.680676] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.680852] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN 
{{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.681025] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.681195] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.681366] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.681532] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.681694] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.681860] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.682031] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.682238] 
env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.682410] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.682578] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.ssl = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.682809] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.682928] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.683105] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.683350] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.683543] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_rabbit.ssl_version = {{(pid=61439) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.683739] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.683910] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_notifications.retry = -1 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.684108] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.684289] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_messaging_notifications.transport_url = **** {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.684467] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_limit.auth_section = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.684633] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_limit.auth_type = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.684793] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_limit.cafile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.684955] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] 
oslo_limit.certfile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.685138] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_limit.collect_timing = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.685301] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_limit.connect_retries = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.685463] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_limit.connect_retry_delay = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.685624] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_limit.endpoint_id = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.685783] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_limit.endpoint_override = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.685946] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_limit.insecure = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.686117] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_limit.keyfile = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.686357] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] 
oslo_limit.max_version = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.686522] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_limit.min_version = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.686691] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_limit.region_name = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.686853] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_limit.service_name = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.687023] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_limit.service_type = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.687199] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_limit.split_loggers = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.687364] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_limit.status_code_retries = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.687524] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_limit.status_code_retry_delay = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.687682] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] 
oslo_limit.timeout = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.687840] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_limit.valid_interfaces = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.687999] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_limit.version = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.688182] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_reports.file_event_handler = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.688352] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_reports.file_event_handler_interval = 1 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.688514] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] oslo_reports.log_dir = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.688687] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.688849] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vif_plug_linux_bridge_privileged.group = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.689014] env[61439]: DEBUG oslo_service.service [None 
req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.689190] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.689357] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.689523] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vif_plug_linux_bridge_privileged.user = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.689688] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.689848] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vif_plug_ovs_privileged.group = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.690015] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vif_plug_ovs_privileged.helper_command = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.690189] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=61439) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.690356] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.690517] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] vif_plug_ovs_privileged.user = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.690688] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] os_vif_linux_bridge.flat_interface = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.690868] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.691054] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.691233] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.691421] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.691573] env[61439]: DEBUG oslo_service.service [None 
req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.691738] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.691901] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] os_vif_linux_bridge.vlan_interface = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.692093] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] os_vif_ovs.default_qos_type = linux-noop {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.692302] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] os_vif_ovs.isolate_vif = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.692480] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.692647] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.692818] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
592.692987] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] os_vif_ovs.ovsdb_interface = native {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.693177] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] os_vif_ovs.per_port_bridge = False {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.693365] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] os_brick.lock_path = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.693539] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] privsep_osbrick.capabilities = [21] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.693701] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] privsep_osbrick.group = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.693862] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] privsep_osbrick.helper_command = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.694131] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.694308] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] privsep_osbrick.thread_pool_size = 8 {{(pid=61439) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.694476] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] privsep_osbrick.user = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.694649] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.694809] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] nova_sys_admin.group = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.694967] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] nova_sys_admin.helper_command = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.695150] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.695336] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] nova_sys_admin.thread_pool_size = 8 {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.695517] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] nova_sys_admin.user = None {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 592.695654] env[61439]: DEBUG oslo_service.service [None req-7a97c2d0-e9d0-4375-8cb6-9399654c6c1d None None] 
******************************************************************************** {{(pid=61439) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2613}} [ 592.696099] env[61439]: INFO nova.service [-] Starting compute node (version 0.1.0) [ 592.706470] env[61439]: WARNING nova.virt.vmwareapi.driver [None req-cab22aa6-962a-4d38-9ae9-440e397ae381 None None] The vmwareapi driver is not tested by the OpenStack project nor does it have clear maintainer(s) and thus its quality can not be ensured. It should be considered experimental and may be removed in a future release. If you are using the driver in production please let us know via the openstack-discuss mailing list. [ 592.706939] env[61439]: INFO nova.virt.node [None req-cab22aa6-962a-4d38-9ae9-440e397ae381 None None] Generated node identity b35c9fce-988b-4acc-b175-83b202107c41 [ 592.707203] env[61439]: INFO nova.virt.node [None req-cab22aa6-962a-4d38-9ae9-440e397ae381 None None] Wrote node identity b35c9fce-988b-4acc-b175-83b202107c41 to /opt/stack/data/n-cpu-1/compute_id [ 592.721257] env[61439]: WARNING nova.compute.manager [None req-cab22aa6-962a-4d38-9ae9-440e397ae381 None None] Compute nodes ['b35c9fce-988b-4acc-b175-83b202107c41'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 592.756031] env[61439]: INFO nova.compute.manager [None req-cab22aa6-962a-4d38-9ae9-440e397ae381 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 592.778354] env[61439]: WARNING nova.compute.manager [None req-cab22aa6-962a-4d38-9ae9-440e397ae381 None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. 
[ 592.778646] env[61439]: DEBUG oslo_concurrency.lockutils [None req-cab22aa6-962a-4d38-9ae9-440e397ae381 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 592.779054] env[61439]: DEBUG oslo_concurrency.lockutils [None req-cab22aa6-962a-4d38-9ae9-440e397ae381 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 592.779054] env[61439]: DEBUG oslo_concurrency.lockutils [None req-cab22aa6-962a-4d38-9ae9-440e397ae381 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 592.779273] env[61439]: DEBUG nova.compute.resource_tracker [None req-cab22aa6-962a-4d38-9ae9-440e397ae381 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=61439) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 592.780422] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-636912fe-6a37-4823-af8a-8dd44c39dbfd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 592.788907] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3cc57b2a-d125-4e47-8dee-84ee343dca43 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 592.802597] env[61439]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15ebb660-b65e-497f-8521-7247965d7554 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 592.808544] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-225a4ec8-5908-4cf7-845a-855274905b6f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 592.838381] env[61439]: DEBUG nova.compute.resource_tracker [None req-cab22aa6-962a-4d38-9ae9-440e397ae381 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181572MB free_disk=35GB free_vcpus=48 pci_devices=None {{(pid=61439) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 592.838496] env[61439]: DEBUG oslo_concurrency.lockutils [None req-cab22aa6-962a-4d38-9ae9-440e397ae381 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 592.838680] env[61439]: DEBUG oslo_concurrency.lockutils [None req-cab22aa6-962a-4d38-9ae9-440e397ae381 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 592.850304] env[61439]: WARNING nova.compute.resource_tracker [None req-cab22aa6-962a-4d38-9ae9-440e397ae381 None None] No compute node record for cpu-1:b35c9fce-988b-4acc-b175-83b202107c41: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host b35c9fce-988b-4acc-b175-83b202107c41 could not be found. 
[ 592.863094] env[61439]: INFO nova.compute.resource_tracker [None req-cab22aa6-962a-4d38-9ae9-440e397ae381 None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: b35c9fce-988b-4acc-b175-83b202107c41 [ 592.929369] env[61439]: DEBUG nova.compute.resource_tracker [None req-cab22aa6-962a-4d38-9ae9-440e397ae381 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 592.929531] env[61439]: DEBUG nova.compute.resource_tracker [None req-cab22aa6-962a-4d38-9ae9-440e397ae381 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 593.045712] env[61439]: INFO nova.scheduler.client.report [None req-cab22aa6-962a-4d38-9ae9-440e397ae381 None None] [req-501eac6e-4cc1-4638-ab6d-18ee10853f5d] Created resource provider record via placement API for resource provider with UUID b35c9fce-988b-4acc-b175-83b202107c41 and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. 
[ 593.062436] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4216b6f-9023-4ca4-b095-ba4718f809bb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 593.070250] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f23bc86-d331-4da4-b2cc-1a166bf4224e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 593.098989] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-445d9866-e428-40e3-a425-9c1faf67ed11 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 593.105624] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39864083-73a4-4191-a6a0-b9cc3cde2d02 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 593.118095] env[61439]: DEBUG nova.compute.provider_tree [None req-cab22aa6-962a-4d38-9ae9-440e397ae381 None None] Updating inventory in ProviderTree for provider b35c9fce-988b-4acc-b175-83b202107c41 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 593.155398] env[61439]: DEBUG nova.scheduler.client.report [None req-cab22aa6-962a-4d38-9ae9-440e397ae381 None None] Updated inventory for provider b35c9fce-988b-4acc-b175-83b202107c41 with generation 0 in Placement from set_inventory_for_provider using data: 
{'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 593.155623] env[61439]: DEBUG nova.compute.provider_tree [None req-cab22aa6-962a-4d38-9ae9-440e397ae381 None None] Updating resource provider b35c9fce-988b-4acc-b175-83b202107c41 generation from 0 to 1 during operation: update_inventory {{(pid=61439) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 593.155769] env[61439]: DEBUG nova.compute.provider_tree [None req-cab22aa6-962a-4d38-9ae9-440e397ae381 None None] Updating inventory in ProviderTree for provider b35c9fce-988b-4acc-b175-83b202107c41 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 593.210711] env[61439]: DEBUG nova.compute.provider_tree [None req-cab22aa6-962a-4d38-9ae9-440e397ae381 None None] Updating resource provider b35c9fce-988b-4acc-b175-83b202107c41 generation from 1 to 2 during operation: update_traits {{(pid=61439) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 593.227402] env[61439]: DEBUG nova.compute.resource_tracker [None req-cab22aa6-962a-4d38-9ae9-440e397ae381 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=61439) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 593.227587] env[61439]: DEBUG oslo_concurrency.lockutils [None req-cab22aa6-962a-4d38-9ae9-440e397ae381 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.389s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 593.227746] env[61439]: DEBUG nova.service [None req-cab22aa6-962a-4d38-9ae9-440e397ae381 None None] Creating RPC server for service compute {{(pid=61439) start /opt/stack/nova/nova/service.py:182}} [ 593.239927] env[61439]: DEBUG nova.service [None req-cab22aa6-962a-4d38-9ae9-440e397ae381 None None] Join ServiceGroup membership for this service compute {{(pid=61439) start /opt/stack/nova/nova/service.py:199}} [ 593.240122] env[61439]: DEBUG nova.servicegroup.drivers.db [None req-cab22aa6-962a-4d38-9ae9-440e397ae381 None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=61439) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 602.530482] env[61439]: DEBUG dbcounter [-] [61439] Writing DB stats nova_cell1:SELECT=1 {{(pid=61439) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}} [ 602.531804] env[61439]: DEBUG dbcounter [-] [61439] Writing DB stats nova_cell0:SELECT=1 {{(pid=61439) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}} [ 604.241643] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._sync_power_states {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 604.252206] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Getting list of instances from cluster (obj){ [ 604.252206] env[61439]: value = "domain-c8" [ 604.252206] env[61439]: _type = 
"ClusterComputeResource" [ 604.252206] env[61439]: } {{(pid=61439) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 604.253261] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c5e2fda-d235-43a3-8125-9383d829f5c1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 604.262313] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Got total of 0 instances {{(pid=61439) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 604.262525] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 604.262829] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Getting list of instances from cluster (obj){ [ 604.262829] env[61439]: value = "domain-c8" [ 604.262829] env[61439]: _type = "ClusterComputeResource" [ 604.262829] env[61439]: } {{(pid=61439) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 604.263643] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bab2784-06f0-44dc-bee6-e55c900c251d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 604.270948] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Got total of 0 instances {{(pid=61439) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 637.924988] env[61439]: DEBUG oslo_concurrency.lockutils [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 
tempest-ServersAdmin275Test-1748718107-project-member] Acquiring lock "2afc8edb-3331-476a-bda3-4f8071461084" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 637.924988] env[61439]: DEBUG oslo_concurrency.lockutils [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Lock "2afc8edb-3331-476a-bda3-4f8071461084" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 637.958444] env[61439]: DEBUG nova.compute.manager [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 638.091517] env[61439]: DEBUG oslo_concurrency.lockutils [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 638.091784] env[61439]: DEBUG oslo_concurrency.lockutils [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 638.093840] env[61439]: INFO nova.compute.claims [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 638.254676] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76ef5ed6-5f8a-4e57-a77f-13bb33073eb5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.263884] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-199f348f-7841-49fc-a5b2-7a845ec34f90 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.306759] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c095a74-2ef6-43ff-b7e2-7bee35c347d6 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.312737] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Acquiring lock "224afc94-367d-4d30-95fb-b8b865f56eb9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 638.313072] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Lock "224afc94-367d-4d30-95fb-b8b865f56eb9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 638.322625] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4949d1a-bc96-44fe-8f4f-b48f8e87cb74 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.329302] env[61439]: DEBUG nova.compute.manager [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 638.340270] env[61439]: DEBUG nova.compute.provider_tree [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 638.359714] env[61439]: DEBUG nova.scheduler.client.report [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 638.394378] env[61439]: DEBUG oslo_concurrency.lockutils [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.302s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 638.394811] env[61439]: DEBUG nova.compute.manager [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Start building networks asynchronously for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 638.411869] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 638.412441] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 638.414689] env[61439]: INFO nova.compute.claims [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 638.446406] env[61439]: DEBUG nova.compute.utils [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 638.448126] env[61439]: DEBUG nova.compute.manager [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Not allocating networking since 'none' was specified. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 638.464690] env[61439]: DEBUG nova.compute.manager [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 638.519959] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9356038-1f5d-4ca1-87ab-88c86c89a334 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.528922] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14254638-f420-49e7-8798-43c1401f111d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.565493] env[61439]: DEBUG nova.compute.manager [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 638.568073] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2eaea045-a7b8-4dad-af1a-7c4cb5391962 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.580029] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89f7b024-039d-4c97-8de7-8b373eebc58f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.596114] env[61439]: DEBUG nova.compute.provider_tree [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 638.608288] env[61439]: DEBUG nova.scheduler.client.report [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 638.634318] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.222s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 638.634872] env[61439]: DEBUG nova.compute.manager [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 638.698484] env[61439]: DEBUG nova.compute.utils [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 638.700159] env[61439]: DEBUG nova.compute.manager [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 638.700268] env[61439]: DEBUG nova.network.neutron [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 638.722090] env[61439]: DEBUG nova.compute.manager [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Start building block device mappings for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 638.799802] env[61439]: DEBUG nova.virt.hardware [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 638.802140] env[61439]: DEBUG nova.virt.hardware [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 638.802804] env[61439]: DEBUG nova.virt.hardware [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 638.802804] env[61439]: DEBUG nova.virt.hardware [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Flavor pref 0:0:0 {{(pid=61439) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 638.802804] env[61439]: DEBUG nova.virt.hardware [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 638.803037] env[61439]: DEBUG nova.virt.hardware [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 638.803121] env[61439]: DEBUG nova.virt.hardware [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 638.803295] env[61439]: DEBUG nova.virt.hardware [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 638.803649] env[61439]: DEBUG nova.virt.hardware [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 638.803815] env[61439]: DEBUG nova.virt.hardware [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 638.803987] env[61439]: DEBUG nova.virt.hardware [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 638.805021] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d658b028-9dd0-43da-8564-4fe447fdc4f7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.813933] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f214373-3c53-4ec9-a1b8-9a4d6ecf4ab6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.821906] env[61439]: DEBUG nova.compute.manager [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 638.837685] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83091054-9e06-4f58-b912-63431d0e8839 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.863392] env[61439]: DEBUG nova.virt.hardware [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 638.863733] env[61439]: DEBUG nova.virt.hardware [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 638.863836] env[61439]: DEBUG nova.virt.hardware [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Image limits 0:0:0 
{{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 638.864023] env[61439]: DEBUG nova.virt.hardware [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 638.864178] env[61439]: DEBUG nova.virt.hardware [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 638.864328] env[61439]: DEBUG nova.virt.hardware [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 638.864539] env[61439]: DEBUG nova.virt.hardware [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 638.864698] env[61439]: DEBUG nova.virt.hardware [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 638.864865] env[61439]: DEBUG nova.virt.hardware [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 
tempest-DeleteServersAdminTestJSON-2134891279-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 638.865032] env[61439]: DEBUG nova.virt.hardware [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 638.865208] env[61439]: DEBUG nova.virt.hardware [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 638.866402] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d3baa1a-ecba-440a-9595-601b54413534 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.877478] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Instance VIF info [] {{(pid=61439) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 638.889090] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Creating folder: OpenStack. Parent ref: group-v4. 
{{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 638.890793] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f039c3f2-1059-4f35-b372-274fe5716c86 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.904780] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80bfb5df-3a30-4d67-92e4-4c8603dc9bfa {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.909270] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Created folder: OpenStack in parent group-v4. [ 638.909270] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Creating folder: Project (2e934b696b374e718726f1a9bc80553e). Parent ref: group-v221281. {{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 638.909505] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c35950ad-772c-469b-aeb8-1ad06a2d9531 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.925322] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Created folder: Project (2e934b696b374e718726f1a9bc80553e) in parent group-v221281. [ 638.925623] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Creating folder: Instances. 
Parent ref: group-v221282. {{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 638.926061] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-76de1d73-603b-4590-b979-d3feaba5726a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.937693] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Created folder: Instances in parent group-v221282. [ 638.937949] env[61439]: DEBUG oslo.service.loopingcall [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 638.940509] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Creating VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 638.940509] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a57f2ddc-08b9-45b3-8b17-0695e954183b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.968379] env[61439]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 638.968379] env[61439]: value = "task-987640" [ 638.968379] env[61439]: _type = "Task" [ 638.968379] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 638.980349] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987640, 'name': CreateVM_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 639.101138] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Acquiring lock "0b4e2497-f306-4bfc-b862-26e53e827c16" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 639.101403] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Lock "0b4e2497-f306-4bfc-b862-26e53e827c16" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 639.142119] env[61439]: DEBUG nova.compute.manager [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 639.244158] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 639.244434] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 639.245978] env[61439]: INFO nova.compute.claims [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 639.417086] env[61439]: DEBUG nova.policy [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '92f701ef37364c629ed2fed539cda2cf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b8d93352efbf41be917afa7c0b5a1ed3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize 
/opt/stack/nova/nova/policy.py:203}} [ 639.417086] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04c43a46-ee72-45ff-9c75-9df5a57e5912 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 639.424355] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21787fd9-f9ee-4883-9a15-fbd48ff2d881 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 639.461257] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1f36877-2d9c-4310-9552-6b7ccf7da39c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 639.474484] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aaf8d7c8-4201-4635-9a85-f8832958fc3f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 639.483702] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987640, 'name': CreateVM_Task, 'duration_secs': 0.290016} completed successfully. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 639.493506] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Created VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 639.493506] env[61439]: DEBUG nova.compute.provider_tree [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 639.494306] env[61439]: DEBUG oslo_vmware.service [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-018312f0-3451-453f-8ee9-848d87f33085 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 639.502517] env[61439]: DEBUG oslo_concurrency.lockutils [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 639.502691] env[61439]: DEBUG oslo_concurrency.lockutils [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 639.503429] env[61439]: DEBUG oslo_concurrency.lockutils [None 
req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 639.503690] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-caa6cd1b-eb2f-4d85-8793-9c4bb3893b7d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 639.507619] env[61439]: DEBUG nova.scheduler.client.report [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 639.513684] env[61439]: DEBUG oslo_vmware.api [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Waiting for the task: (returnval){ [ 639.513684] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]524b0fca-8adc-6922-74bf-168728816744" [ 639.513684] env[61439]: _type = "Task" [ 639.513684] env[61439]: } to complete. 
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 639.523118] env[61439]: DEBUG oslo_vmware.api [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]524b0fca-8adc-6922-74bf-168728816744, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 639.532672] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.288s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 639.532807] env[61439]: DEBUG nova.compute.manager [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 639.594799] env[61439]: DEBUG nova.compute.utils [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 639.596923] env[61439]: DEBUG nova.compute.manager [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 639.597230] env[61439]: DEBUG nova.network.neutron [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 639.612096] env[61439]: DEBUG nova.compute.manager [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 639.704159] env[61439]: DEBUG nova.compute.manager [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 639.738211] env[61439]: DEBUG nova.virt.hardware [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 639.738465] env[61439]: DEBUG nova.virt.hardware [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 639.738638] env[61439]: DEBUG nova.virt.hardware [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 639.738886] env[61439]: DEBUG nova.virt.hardware [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Flavor pref 
0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 639.741580] env[61439]: DEBUG nova.virt.hardware [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 639.744615] env[61439]: DEBUG nova.virt.hardware [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 639.744615] env[61439]: DEBUG nova.virt.hardware [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 639.744615] env[61439]: DEBUG nova.virt.hardware [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 639.744615] env[61439]: DEBUG nova.virt.hardware [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 639.744615] env[61439]: DEBUG nova.virt.hardware [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 
tempest-ServerExternalEventsTest-723285954-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 639.744815] env[61439]: DEBUG nova.virt.hardware [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 639.744815] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1619e143-b1a5-4450-84dd-d3923909f28b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 639.757605] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c62fb58b-0cdf-4546-ad55-2309420c30e7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 639.896133] env[61439]: DEBUG nova.policy [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6ec8ca0476d2409cb087e4d3e92fdd7d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9bf93d58be7b48c298cc4ba7d350b918', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 640.027797] env[61439]: DEBUG oslo_concurrency.lockutils [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 
tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 640.028206] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Processing image a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 640.028528] env[61439]: DEBUG oslo_concurrency.lockutils [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 640.029489] env[61439]: DEBUG oslo_concurrency.lockutils [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 640.029489] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 640.029489] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-5c233504-f0f6-48ab-8e39-d5a9c0d9c0ce {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.053629] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 640.053925] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=61439) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 640.059099] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08267faa-6e05-46dc-87b4-0abbae046522 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.069209] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-578910d5-2070-4c5a-8147-fbacbaf80e8c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.077129] env[61439]: DEBUG oslo_vmware.api [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Waiting for the task: (returnval){ [ 640.077129] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52c92cac-3b4a-f18b-fc2d-dbfaf6ad6bc3" [ 640.077129] env[61439]: _type = "Task" [ 640.077129] env[61439]: } to complete. 
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 640.088849] env[61439]: DEBUG oslo_vmware.api [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52c92cac-3b4a-f18b-fc2d-dbfaf6ad6bc3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 640.544960] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Acquiring lock "10f688a0-63af-496f-93dd-794083acf94b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 640.545567] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Lock "10f688a0-63af-496f-93dd-794083acf94b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 640.562723] env[61439]: DEBUG nova.compute.manager [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 640.587572] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Preparing fetch location {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 640.587982] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Creating directory with path [datastore2] vmware_temp/d1d03143-68bf-4adf-89ed-e27f21cbd362/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 640.588311] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a563a371-7e4f-4fa1-8126-3ae5466de52f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.623100] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 640.624030] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 640.626687] env[61439]: INFO nova.compute.claims [None 
req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 640.632017] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Created directory with path [datastore2] vmware_temp/d1d03143-68bf-4adf-89ed-e27f21cbd362/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 640.632017] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Fetch image to [datastore2] vmware_temp/d1d03143-68bf-4adf-89ed-e27f21cbd362/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 640.632017] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to [datastore2] vmware_temp/d1d03143-68bf-4adf-89ed-e27f21cbd362/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 640.632017] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ff1fc9e-4236-45cd-8dbb-9afe6fb0f692 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.637958] env[61439]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee40745a-e8aa-4596-9e1f-b0244a333d32 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.652307] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2de25343-c33c-46b3-995e-3ba9944a78c8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.697678] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-978e86ba-beac-42b7-8fe9-9559a11a1ae8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.705335] env[61439]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fb7ae211-9c48-4272-9a07-86a29d868620 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.733945] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 640.815746] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b60e4476-93cb-4f86-b654-571d88455c8d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.826934] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-460bab4a-6a86-4301-88ec-ad81ecb82a46 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.859776] env[61439]: DEBUG oslo_vmware.rw_handles [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d1d03143-68bf-4adf-89ed-e27f21cbd362/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 640.862568] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-296bb331-7be2-43b5-b0b0-3499a2554286 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.924489] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae726606-435e-4dc5-b61e-dcdc07923d09 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.928736] env[61439]: DEBUG oslo_vmware.rw_handles [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Completed reading data from the image iterator. {{(pid=61439) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 640.928927] env[61439]: DEBUG oslo_vmware.rw_handles [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d1d03143-68bf-4adf-89ed-e27f21cbd362/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=61439) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 640.940995] env[61439]: DEBUG nova.compute.provider_tree [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 640.952819] env[61439]: DEBUG nova.scheduler.client.report [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 640.971618] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.348s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 640.972132] env[61439]: DEBUG nova.compute.manager [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Start building networks asynchronously for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 641.013028] env[61439]: DEBUG nova.compute.utils [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 641.014268] env[61439]: DEBUG nova.compute.manager [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 641.014446] env[61439]: DEBUG nova.network.neutron [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 641.025652] env[61439]: DEBUG nova.compute.manager [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 641.102543] env[61439]: DEBUG nova.compute.manager [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 641.139616] env[61439]: DEBUG nova.virt.hardware [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 641.140481] env[61439]: DEBUG nova.virt.hardware [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 641.140481] env[61439]: DEBUG nova.virt.hardware [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 641.140481] env[61439]: DEBUG nova.virt.hardware [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Flavor pref 0:0:0 {{(pid=61439) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 641.140481] env[61439]: DEBUG nova.virt.hardware [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 641.140941] env[61439]: DEBUG nova.virt.hardware [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 641.140941] env[61439]: DEBUG nova.virt.hardware [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 641.141044] env[61439]: DEBUG nova.virt.hardware [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 641.141194] env[61439]: DEBUG nova.virt.hardware [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 641.141359] env[61439]: DEBUG nova.virt.hardware [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 641.141536] env[61439]: DEBUG nova.virt.hardware [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 641.143128] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0e23402-366a-4044-9bdd-5f2c030884f3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 641.152464] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef3c0fbb-6a90-4c2e-9357-5f9f7e3c8ec2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 641.428309] env[61439]: DEBUG nova.policy [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bfae70c3a6a94df4a6689ac18626fe37', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '870604e0902d43528453a2950d1080d3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 641.490499] env[61439]: DEBUG nova.network.neutron [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 
0b4e2497-f306-4bfc-b862-26e53e827c16] Successfully created port: 0066a37e-57c5-4cb7-a8b6-57709c4e4cc3 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 642.070876] env[61439]: DEBUG nova.network.neutron [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Successfully created port: 88a8eeb7-54a6-49a9-aae8-984e651885b1 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 643.017106] env[61439]: DEBUG nova.network.neutron [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Successfully created port: 54a2734c-5736-4c20-a208-d59449f937f8 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 643.420737] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Acquiring lock "025daa70-33c7-4a0d-addf-94b680ad8c4e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 643.421106] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Lock "025daa70-33c7-4a0d-addf-94b680ad8c4e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 643.436941] env[61439]: DEBUG 
nova.compute.manager [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 643.515463] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 643.515463] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 643.516566] env[61439]: INFO nova.compute.claims [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 643.672058] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94704cbf-92ae-4186-9f26-59db87adc5a3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 643.684013] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e5d71f6-3e93-4464-8a20-fa8889f4dfcb 
{{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 643.720343] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-850e4e71-aa00-482a-81f2-b0012e35ddd7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 643.728791] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0e58ddd-5262-45ba-ab8f-3d353e91c508 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 643.743609] env[61439]: DEBUG nova.compute.provider_tree [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 643.753118] env[61439]: DEBUG nova.scheduler.client.report [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 643.767901] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 
tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.253s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 643.768111] env[61439]: DEBUG nova.compute.manager [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 643.811085] env[61439]: DEBUG nova.compute.utils [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 643.816988] env[61439]: DEBUG nova.compute.manager [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 643.818018] env[61439]: DEBUG nova.network.neutron [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 643.830521] env[61439]: DEBUG nova.compute.manager [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 643.918864] env[61439]: DEBUG nova.compute.manager [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 643.953803] env[61439]: DEBUG nova.virt.hardware [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 643.954045] env[61439]: DEBUG nova.virt.hardware [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 643.954731] env[61439]: DEBUG nova.virt.hardware [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 643.954974] env[61439]: DEBUG nova.virt.hardware [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 
tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 643.955735] env[61439]: DEBUG nova.virt.hardware [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 643.955991] env[61439]: DEBUG nova.virt.hardware [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 643.956366] env[61439]: DEBUG nova.virt.hardware [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 643.956565] env[61439]: DEBUG nova.virt.hardware [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 643.956743] env[61439]: DEBUG nova.virt.hardware [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 
643.956990] env[61439]: DEBUG nova.virt.hardware [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 643.957214] env[61439]: DEBUG nova.virt.hardware [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 643.958684] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce892e54-63c1-4f8f-9d14-52eaf07c2886 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 643.968789] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6b3d013-bb2a-4041-8b3d-e38306e7aae3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 644.174813] env[61439]: DEBUG nova.policy [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd18b73c168694833848a4b8625885086', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '36ef9b8223f9482e822ab862327f5a4b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} 
{{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 646.563454] env[61439]: DEBUG nova.network.neutron [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Successfully created port: 3aa58b3c-4708-4d52-ac7e-6e0f2eee48f5 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 648.211602] env[61439]: ERROR nova.compute.manager [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 0066a37e-57c5-4cb7-a8b6-57709c4e4cc3, please check neutron logs for more information. [ 648.211602] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 648.211602] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 648.211602] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 648.211602] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 648.211602] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 648.211602] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 648.211602] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 648.211602] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 648.211602] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 648.211602] env[61439]: ERROR nova.compute.manager File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 648.211602] env[61439]: ERROR nova.compute.manager raise self.value [ 648.211602] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 648.211602] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 648.211602] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 648.211602] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 648.212286] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 648.212286] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 648.212286] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 0066a37e-57c5-4cb7-a8b6-57709c4e4cc3, please check neutron logs for more information. 
[ 648.212286] env[61439]: ERROR nova.compute.manager [ 648.212415] env[61439]: Traceback (most recent call last): [ 648.212415] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 648.212415] env[61439]: listener.cb(fileno) [ 648.212415] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 648.212415] env[61439]: result = function(*args, **kwargs) [ 648.212415] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 648.212415] env[61439]: return func(*args, **kwargs) [ 648.212415] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 648.212415] env[61439]: raise e [ 648.212415] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 648.212415] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 648.212415] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 648.212415] env[61439]: created_port_ids = self._update_ports_for_instance( [ 648.212415] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 648.212415] env[61439]: with excutils.save_and_reraise_exception(): [ 648.212415] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 648.212415] env[61439]: self.force_reraise() [ 648.212415] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 648.212415] env[61439]: raise self.value [ 648.212415] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 648.212415] env[61439]: updated_port = self._update_port( [ 648.212415] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 648.212415] env[61439]: 
_ensure_no_port_binding_failure(port) [ 648.212415] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 648.212415] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 648.212415] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 0066a37e-57c5-4cb7-a8b6-57709c4e4cc3, please check neutron logs for more information. [ 648.212415] env[61439]: Removing descriptor: 20 [ 648.213953] env[61439]: ERROR nova.compute.manager [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 0066a37e-57c5-4cb7-a8b6-57709c4e4cc3, please check neutron logs for more information. [ 648.213953] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Traceback (most recent call last): [ 648.213953] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 648.213953] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] yield resources [ 648.213953] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 648.213953] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] self.driver.spawn(context, instance, image_meta, [ 648.213953] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 648.213953] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] self._vmops.spawn(context, instance, image_meta, injected_files, [ 
648.213953] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 648.213953] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] vm_ref = self.build_virtual_machine(instance, [ 648.213953] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 648.214347] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] vif_infos = vmwarevif.get_vif_info(self._session, [ 648.214347] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 648.214347] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] for vif in network_info: [ 648.214347] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 648.214347] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] return self._sync_wrapper(fn, *args, **kwargs) [ 648.214347] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 648.214347] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] self.wait() [ 648.214347] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 648.214347] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] self[:] = self._gt.wait() [ 648.214347] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 648.214347] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] return self._exit_event.wait() [ 648.214347] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 648.214347] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] result = hub.switch() [ 648.214718] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 648.214718] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] return self.greenlet.switch() [ 648.214718] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 648.214718] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] result = function(*args, **kwargs) [ 648.214718] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 648.214718] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] return func(*args, **kwargs) [ 648.214718] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 648.214718] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] raise e [ 648.214718] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in 
_allocate_network_async [ 648.214718] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] nwinfo = self.network_api.allocate_for_instance( [ 648.214718] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 648.214718] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] created_port_ids = self._update_ports_for_instance( [ 648.214718] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 648.215066] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] with excutils.save_and_reraise_exception(): [ 648.215066] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 648.215066] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] self.force_reraise() [ 648.215066] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 648.215066] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] raise self.value [ 648.215066] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 648.215066] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] updated_port = self._update_port( [ 648.215066] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] File "/opt/stack/nova/nova/network/neutron.py", line 585, in 
_update_port [ 648.215066] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] _ensure_no_port_binding_failure(port) [ 648.215066] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 648.215066] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] raise exception.PortBindingFailed(port_id=port['id']) [ 648.215066] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] nova.exception.PortBindingFailed: Binding failed for port 0066a37e-57c5-4cb7-a8b6-57709c4e4cc3, please check neutron logs for more information. [ 648.215066] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] [ 648.215422] env[61439]: INFO nova.compute.manager [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Terminating instance [ 648.216768] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Acquiring lock "refresh_cache-0b4e2497-f306-4bfc-b862-26e53e827c16" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 648.216849] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Acquired lock "refresh_cache-0b4e2497-f306-4bfc-b862-26e53e827c16" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 648.216980] env[61439]: DEBUG nova.network.neutron [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 
tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 648.266920] env[61439]: DEBUG nova.network.neutron [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 648.634459] env[61439]: DEBUG nova.network.neutron [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 648.651979] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Releasing lock "refresh_cache-0b4e2497-f306-4bfc-b862-26e53e827c16" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 648.655673] env[61439]: DEBUG nova.compute.manager [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 648.655673] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 648.655673] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d04590cf-2ce9-4cf7-be37-c49174997e6d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 648.666045] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22e73f19-66d0-4ab0-b602-07cba9eafa02 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 648.694715] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 0b4e2497-f306-4bfc-b862-26e53e827c16 could not be found. 
[ 648.694949] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 648.695361] env[61439]: INFO nova.compute.manager [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Took 0.04 seconds to destroy the instance on the hypervisor. [ 648.695623] env[61439]: DEBUG oslo.service.loopingcall [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 648.696153] env[61439]: DEBUG nova.compute.manager [-] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 648.696254] env[61439]: DEBUG nova.network.neutron [-] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 648.733904] env[61439]: DEBUG nova.network.neutron [-] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 648.756495] env[61439]: DEBUG nova.network.neutron [-] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 648.771587] env[61439]: INFO nova.compute.manager [-] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Took 0.07 seconds to deallocate network for instance.
[ 648.774665] env[61439]: DEBUG nova.compute.claims [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 648.774747] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 648.775535] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 648.956539] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1042cb3-6dbe-4cc0-872b-6f90a5a652c3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 648.968614] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f737e54b-5829-4883-ba58-b9158efbaeb9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 649.010340] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9cd2405c-9bbc-41f7-930d-628296e24718 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 649.018175] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3367de36-e562-4892-b101-7d924592721f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 649.033615] env[61439]: DEBUG nova.compute.provider_tree [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 649.048350] env[61439]: DEBUG nova.scheduler.client.report [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 649.068957] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.293s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 649.068957] env[61439]: ERROR nova.compute.manager [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 0066a37e-57c5-4cb7-a8b6-57709c4e4cc3, please check neutron logs for more information.
[ 649.068957] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Traceback (most recent call last):
[ 649.068957] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 649.068957] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]     self.driver.spawn(context, instance, image_meta,
[ 649.068957] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 649.068957] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 649.068957] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 649.068957] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]     vm_ref = self.build_virtual_machine(instance,
[ 649.069294] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 649.069294] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 649.069294] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 649.069294] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]     for vif in network_info:
[ 649.069294] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 649.069294] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]     return self._sync_wrapper(fn, *args, **kwargs)
[ 649.069294] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 649.069294] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]     self.wait()
[ 649.069294] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 649.069294] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]     self[:] = self._gt.wait()
[ 649.069294] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 649.069294] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]     return self._exit_event.wait()
[ 649.069294] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 649.069719] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]     result = hub.switch()
[ 649.069719] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 649.069719] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]     return self.greenlet.switch()
[ 649.069719] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 649.069719] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]     result = function(*args, **kwargs)
[ 649.069719] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]   File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 649.069719] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]     return func(*args, **kwargs)
[ 649.069719] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]   File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 649.069719] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]     raise e
[ 649.069719] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]   File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 649.069719] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]     nwinfo = self.network_api.allocate_for_instance(
[ 649.069719] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 649.069719] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]     created_port_ids = self._update_ports_for_instance(
[ 649.070103] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 649.070103] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]     with excutils.save_and_reraise_exception():
[ 649.070103] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 649.070103] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]     self.force_reraise()
[ 649.070103] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 649.070103] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]     raise self.value
[ 649.070103] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 649.070103] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]     updated_port = self._update_port(
[ 649.070103] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 649.070103] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]     _ensure_no_port_binding_failure(port)
[ 649.070103] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 649.070103] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]     raise exception.PortBindingFailed(port_id=port['id'])
[ 649.070435] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] nova.exception.PortBindingFailed: Binding failed for port 0066a37e-57c5-4cb7-a8b6-57709c4e4cc3, please check neutron logs for more information.
[ 649.070435] env[61439]: ERROR nova.compute.manager [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16]
[ 649.070435] env[61439]: DEBUG nova.compute.utils [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Binding failed for port 0066a37e-57c5-4cb7-a8b6-57709c4e4cc3, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 649.081886] env[61439]: DEBUG nova.compute.manager [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Build of instance 0b4e2497-f306-4bfc-b862-26e53e827c16 was re-scheduled: Binding failed for port 0066a37e-57c5-4cb7-a8b6-57709c4e4cc3, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 649.081886] env[61439]: DEBUG nova.compute.manager [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 649.081886] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Acquiring lock "refresh_cache-0b4e2497-f306-4bfc-b862-26e53e827c16" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 649.081886] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Acquired lock "refresh_cache-0b4e2497-f306-4bfc-b862-26e53e827c16" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 649.082225] env[61439]: DEBUG nova.network.neutron [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 649.184019] env[61439]: DEBUG nova.network.neutron [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 649.214232] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 649.214603] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 649.214816] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Starting heal instance info cache {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 649.214939] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Rebuilding the list of instances to heal {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 649.231824] env[61439]: ERROR nova.compute.manager [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 54a2734c-5736-4c20-a208-d59449f937f8, please check neutron logs for more information.
[ 649.231824] env[61439]: ERROR nova.compute.manager Traceback (most recent call last):
[ 649.231824] env[61439]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 649.231824] env[61439]: ERROR nova.compute.manager     nwinfo = self.network_api.allocate_for_instance(
[ 649.231824] env[61439]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 649.231824] env[61439]: ERROR nova.compute.manager     created_port_ids = self._update_ports_for_instance(
[ 649.231824] env[61439]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 649.231824] env[61439]: ERROR nova.compute.manager     with excutils.save_and_reraise_exception():
[ 649.231824] env[61439]: ERROR nova.compute.manager   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 649.231824] env[61439]: ERROR nova.compute.manager     self.force_reraise()
[ 649.231824] env[61439]: ERROR nova.compute.manager   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 649.231824] env[61439]: ERROR nova.compute.manager     raise self.value
[ 649.231824] env[61439]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 649.231824] env[61439]: ERROR nova.compute.manager     updated_port = self._update_port(
[ 649.231824] env[61439]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 649.231824] env[61439]: ERROR nova.compute.manager     _ensure_no_port_binding_failure(port)
[ 649.233652] env[61439]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 649.233652] env[61439]: ERROR nova.compute.manager     raise exception.PortBindingFailed(port_id=port['id'])
[ 649.233652] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 54a2734c-5736-4c20-a208-d59449f937f8, please check neutron logs for more information.
[ 649.233652] env[61439]: ERROR nova.compute.manager
[ 649.233652] env[61439]: Traceback (most recent call last):
[ 649.233652] env[61439]:   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait
[ 649.233652] env[61439]:     listener.cb(fileno)
[ 649.233652] env[61439]:   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 649.233652] env[61439]:     result = function(*args, **kwargs)
[ 649.233652] env[61439]:   File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 649.233652] env[61439]:     return func(*args, **kwargs)
[ 649.233652] env[61439]:   File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 649.233652] env[61439]:     raise e
[ 649.233652] env[61439]:   File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 649.233652] env[61439]:     nwinfo = self.network_api.allocate_for_instance(
[ 649.233652] env[61439]:   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 649.233652] env[61439]:     created_port_ids = self._update_ports_for_instance(
[ 649.233652] env[61439]:   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 649.233652] env[61439]:     with excutils.save_and_reraise_exception():
[ 649.233652] env[61439]:   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 649.233652] env[61439]:     self.force_reraise()
[ 649.233652] env[61439]:   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 649.233652] env[61439]:     raise self.value
[ 649.233652] env[61439]:   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 649.233652] env[61439]:     updated_port = self._update_port(
[ 649.233652] env[61439]:   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 649.233652] env[61439]:     _ensure_no_port_binding_failure(port)
[ 649.233652] env[61439]:   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 649.233652] env[61439]:     raise exception.PortBindingFailed(port_id=port['id'])
[ 649.234371] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 54a2734c-5736-4c20-a208-d59449f937f8, please check neutron logs for more information.
[ 649.234371] env[61439]: Removing descriptor: 21
[ 649.234371] env[61439]: ERROR nova.compute.manager [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 54a2734c-5736-4c20-a208-d59449f937f8, please check neutron logs for more information.
[ 649.234371] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] Traceback (most recent call last):
[ 649.234371] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]   File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 649.234371] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]     yield resources
[ 649.234371] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 649.234371] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]     self.driver.spawn(context, instance, image_meta,
[ 649.234371] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 649.234371] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 649.234371] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 649.234371] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]     vm_ref = self.build_virtual_machine(instance,
[ 649.234693] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 649.234693] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 649.234693] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 649.234693] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]     for vif in network_info:
[ 649.234693] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 649.234693] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]     return self._sync_wrapper(fn, *args, **kwargs)
[ 649.234693] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 649.234693] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]     self.wait()
[ 649.234693] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 649.234693] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]     self[:] = self._gt.wait()
[ 649.234693] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 649.234693] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]     return self._exit_event.wait()
[ 649.234693] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 649.235059] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]     result = hub.switch()
[ 649.235059] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 649.235059] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]     return self.greenlet.switch()
[ 649.235059] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 649.235059] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]     result = function(*args, **kwargs)
[ 649.235059] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]   File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 649.235059] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]     return func(*args, **kwargs)
[ 649.235059] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]   File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 649.235059] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]     raise e
[ 649.235059] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]   File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 649.235059] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]     nwinfo = self.network_api.allocate_for_instance(
[ 649.235059] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 649.235059] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]     created_port_ids = self._update_ports_for_instance(
[ 649.235459] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 649.235459] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]     with excutils.save_and_reraise_exception():
[ 649.235459] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 649.235459] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]     self.force_reraise()
[ 649.235459] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 649.235459] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]     raise self.value
[ 649.235459] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 649.235459] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]     updated_port = self._update_port(
[ 649.235459] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 649.235459] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]     _ensure_no_port_binding_failure(port)
[ 649.235459] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 649.235459] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]     raise exception.PortBindingFailed(port_id=port['id'])
[ 649.235776] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] nova.exception.PortBindingFailed: Binding failed for port 54a2734c-5736-4c20-a208-d59449f937f8, please check neutron logs for more information.
[ 649.235776] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b]
[ 649.235776] env[61439]: INFO nova.compute.manager [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Terminating instance
[ 649.235776] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Acquiring lock "refresh_cache-10f688a0-63af-496f-93dd-794083acf94b" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 649.235776] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Acquired lock "refresh_cache-10f688a0-63af-496f-93dd-794083acf94b" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 649.235776] env[61439]: DEBUG nova.network.neutron [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 649.241071] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 649.241071] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 649.241071] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 649.241071] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 649.241071] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Didn't find any instances for network info cache update. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 649.242159] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 649.242315] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 649.242562] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 649.242768] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 649.245706] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 649.246575] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 649.246762] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=61439) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 649.253869] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager.update_available_resource {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 649.270861] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 649.270861] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 649.270861] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 649.270861] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=61439) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 649.272572] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae4ee304-08fd-4c70-8561-3ce988afddb2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 649.282417] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-339aebd6-99c7-47db-90ff-0bda2174325b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 649.299636] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e00c86a-d784-4475-8124-60f260b10f0a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 649.306748] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a436d686-84c6-4d4c-8d64-d72365436013 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 649.345285] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181551MB free_disk=35GB free_vcpus=48 pci_devices=None {{(pid=61439) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 649.345664] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 649.345975] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 649.351251] env[61439]: DEBUG nova.network.neutron [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 649.424427] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 2afc8edb-3331-476a-bda3-4f8071461084 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 649.424522] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 224afc94-367d-4d30-95fb-b8b865f56eb9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 649.464838] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 0b4e2497-f306-4bfc-b862-26e53e827c16 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 649.465388] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 10f688a0-63af-496f-93dd-794083acf94b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
{{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 649.465388] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 025daa70-33c7-4a0d-addf-94b680ad8c4e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 649.465388] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Total usable vcpus: 48, total allocated vcpus: 4 {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 649.466153] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1024MB phys_disk=200GB used_disk=4GB total_vcpus=48 used_vcpus=4 pci_stats=[] {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 649.524595] env[61439]: DEBUG nova.network.neutron [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 649.541153] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Releasing lock "refresh_cache-0b4e2497-f306-4bfc-b862-26e53e827c16" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 649.541153] env[61439]: DEBUG 
nova.compute.manager [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 649.541153] env[61439]: DEBUG nova.compute.manager [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 649.541153] env[61439]: DEBUG nova.network.neutron [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 649.580911] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44cf3ef7-d36c-4121-aeb5-db61a0b1c153 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 649.592238] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccd3cdd2-b412-4cb1-be60-028c824f37bf {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 649.598874] env[61439]: DEBUG nova.network.neutron [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 649.634250] env[61439]: DEBUG nova.network.neutron [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 649.636595] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa7b5db2-ef53-45f5-ac94-9094103247e5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 649.646709] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a62a1c6-eecb-45f0-b835-3aa965d77f41 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 649.653514] env[61439]: INFO nova.compute.manager [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] [instance: 0b4e2497-f306-4bfc-b862-26e53e827c16] Took 0.11 seconds to deallocate network for instance. 
[ 649.666008] env[61439]: DEBUG nova.compute.provider_tree [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 649.676545] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 649.716253] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=61439) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 649.716253] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.370s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 649.816387] env[61439]: DEBUG nova.network.neutron [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info 
/opt/stack/nova/nova/network/neutron.py:116}} [ 649.818252] env[61439]: INFO nova.scheduler.client.report [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Deleted allocations for instance 0b4e2497-f306-4bfc-b862-26e53e827c16 [ 649.845105] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Releasing lock "refresh_cache-10f688a0-63af-496f-93dd-794083acf94b" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 649.845435] env[61439]: DEBUG nova.compute.manager [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 649.845722] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 649.848108] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2822975a-828b-4279-a8cf-35ae280ccd49 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 649.865419] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cf7778d-7e9d-415f-81e8-c580d0042c9f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 649.880812] env[61439]: DEBUG 
oslo_concurrency.lockutils [None req-8febf802-3e3a-4c25-9d0a-bac645a436e3 tempest-ServerExternalEventsTest-723285954 tempest-ServerExternalEventsTest-723285954-project-member] Lock "0b4e2497-f306-4bfc-b862-26e53e827c16" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.779s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 649.897855] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 10f688a0-63af-496f-93dd-794083acf94b could not be found. [ 649.899420] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 649.899420] env[61439]: INFO nova.compute.manager [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Took 0.05 seconds to destroy the instance on the hypervisor. [ 649.899420] env[61439]: DEBUG oslo.service.loopingcall [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 649.899420] env[61439]: DEBUG nova.compute.manager [-] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 649.899420] env[61439]: DEBUG nova.network.neutron [-] [instance: 10f688a0-63af-496f-93dd-794083acf94b] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 649.972633] env[61439]: DEBUG nova.network.neutron [-] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 649.982807] env[61439]: DEBUG nova.network.neutron [-] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 649.997226] env[61439]: INFO nova.compute.manager [-] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Took 0.10 seconds to deallocate network for instance. 
[ 650.003977] env[61439]: DEBUG nova.compute.claims [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 650.003977] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 650.004196] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 650.155613] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81c5502d-44e3-42df-89df-d9f42c147f0d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.168360] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfdbd296-a0a0-4bf3-aa70-0e143e6b7647 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.210750] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78753bbb-fd0c-4c2d-bfee-b1ab4ee0eb24 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.223346] 
env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-485e79b8-2392-4b2a-aba2-99ffbace746e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.240226] env[61439]: DEBUG nova.compute.provider_tree [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 650.257753] env[61439]: DEBUG nova.scheduler.client.report [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 650.278748] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.274s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 650.280377] env[61439]: ERROR nova.compute.manager [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 
tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 54a2734c-5736-4c20-a208-d59449f937f8, please check neutron logs for more information. [ 650.280377] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] Traceback (most recent call last): [ 650.280377] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 650.280377] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] self.driver.spawn(context, instance, image_meta, [ 650.280377] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 650.280377] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 650.280377] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 650.280377] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] vm_ref = self.build_virtual_machine(instance, [ 650.280377] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 650.280377] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] vif_infos = vmwarevif.get_vif_info(self._session, [ 650.280377] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 650.280735] env[61439]: ERROR nova.compute.manager 
[instance: 10f688a0-63af-496f-93dd-794083acf94b] for vif in network_info: [ 650.280735] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 650.280735] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] return self._sync_wrapper(fn, *args, **kwargs) [ 650.280735] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 650.280735] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] self.wait() [ 650.280735] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 650.280735] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] self[:] = self._gt.wait() [ 650.280735] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 650.280735] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] return self._exit_event.wait() [ 650.280735] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 650.280735] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] result = hub.switch() [ 650.280735] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 650.280735] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] return self.greenlet.switch() [ 650.281068] env[61439]: ERROR 
nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 650.281068] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] result = function(*args, **kwargs) [ 650.281068] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 650.281068] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] return func(*args, **kwargs) [ 650.281068] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 650.281068] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] raise e [ 650.281068] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 650.281068] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] nwinfo = self.network_api.allocate_for_instance( [ 650.281068] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 650.281068] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] created_port_ids = self._update_ports_for_instance( [ 650.281068] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 650.281068] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] with excutils.save_and_reraise_exception(): [ 650.281068] env[61439]: ERROR nova.compute.manager [instance: 
10f688a0-63af-496f-93dd-794083acf94b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 650.281398] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] self.force_reraise() [ 650.281398] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 650.281398] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] raise self.value [ 650.281398] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 650.281398] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] updated_port = self._update_port( [ 650.281398] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 650.281398] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] _ensure_no_port_binding_failure(port) [ 650.281398] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 650.281398] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] raise exception.PortBindingFailed(port_id=port['id']) [ 650.281398] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] nova.exception.PortBindingFailed: Binding failed for port 54a2734c-5736-4c20-a208-d59449f937f8, please check neutron logs for more information. 
[ 650.281398] env[61439]: ERROR nova.compute.manager [instance: 10f688a0-63af-496f-93dd-794083acf94b] [ 650.281681] env[61439]: DEBUG nova.compute.utils [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Binding failed for port 54a2734c-5736-4c20-a208-d59449f937f8, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 650.283432] env[61439]: DEBUG nova.compute.manager [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Build of instance 10f688a0-63af-496f-93dd-794083acf94b was re-scheduled: Binding failed for port 54a2734c-5736-4c20-a208-d59449f937f8, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 650.284379] env[61439]: DEBUG nova.compute.manager [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 650.284657] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Acquiring lock "refresh_cache-10f688a0-63af-496f-93dd-794083acf94b" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 650.284809] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Acquired 
lock "refresh_cache-10f688a0-63af-496f-93dd-794083acf94b" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 650.284973] env[61439]: DEBUG nova.network.neutron [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 650.329018] env[61439]: ERROR nova.compute.manager [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 88a8eeb7-54a6-49a9-aae8-984e651885b1, please check neutron logs for more information. [ 650.329018] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 650.329018] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 650.329018] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 650.329018] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 650.329018] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 650.329018] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 650.329018] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 650.329018] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 650.329018] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 650.329018] env[61439]: 
ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 650.329018] env[61439]: ERROR nova.compute.manager raise self.value [ 650.329018] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 650.329018] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 650.329018] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 650.329018] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 650.329580] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 650.329580] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 650.329580] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 88a8eeb7-54a6-49a9-aae8-984e651885b1, please check neutron logs for more information. 
[ 650.329580] env[61439]: ERROR nova.compute.manager [ 650.329580] env[61439]: Traceback (most recent call last): [ 650.329580] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 650.329580] env[61439]: listener.cb(fileno) [ 650.329580] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 650.329580] env[61439]: result = function(*args, **kwargs) [ 650.329580] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 650.329580] env[61439]: return func(*args, **kwargs) [ 650.329580] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 650.329580] env[61439]: raise e [ 650.329580] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 650.329580] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 650.329580] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 650.329580] env[61439]: created_port_ids = self._update_ports_for_instance( [ 650.329580] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 650.329580] env[61439]: with excutils.save_and_reraise_exception(): [ 650.329580] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 650.329580] env[61439]: self.force_reraise() [ 650.329580] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 650.329580] env[61439]: raise self.value [ 650.329580] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 650.329580] env[61439]: updated_port = self._update_port( [ 650.329580] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 650.329580] env[61439]: 
_ensure_no_port_binding_failure(port) [ 650.329580] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 650.329580] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 650.330282] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 88a8eeb7-54a6-49a9-aae8-984e651885b1, please check neutron logs for more information. [ 650.330282] env[61439]: Removing descriptor: 18 [ 650.330282] env[61439]: ERROR nova.compute.manager [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 88a8eeb7-54a6-49a9-aae8-984e651885b1, please check neutron logs for more information. [ 650.330282] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Traceback (most recent call last): [ 650.330282] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 650.330282] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] yield resources [ 650.330282] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 650.330282] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] self.driver.spawn(context, instance, image_meta, [ 650.330282] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 650.330282] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] self._vmops.spawn(context, instance, image_meta, 
injected_files, [ 650.330282] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 650.330282] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] vm_ref = self.build_virtual_machine(instance, [ 650.330622] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 650.330622] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] vif_infos = vmwarevif.get_vif_info(self._session, [ 650.330622] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 650.330622] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] for vif in network_info: [ 650.330622] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 650.330622] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] return self._sync_wrapper(fn, *args, **kwargs) [ 650.330622] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 650.330622] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] self.wait() [ 650.330622] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 650.330622] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] self[:] = self._gt.wait() [ 650.330622] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 650.330622] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] return self._exit_event.wait() [ 650.330622] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 650.330968] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] result = hub.switch() [ 650.330968] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 650.330968] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] return self.greenlet.switch() [ 650.330968] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 650.330968] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] result = function(*args, **kwargs) [ 650.330968] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 650.330968] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] return func(*args, **kwargs) [ 650.330968] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 650.330968] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] raise e [ 650.330968] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in 
_allocate_network_async [ 650.330968] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] nwinfo = self.network_api.allocate_for_instance( [ 650.330968] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 650.330968] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] created_port_ids = self._update_ports_for_instance( [ 650.331464] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 650.331464] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] with excutils.save_and_reraise_exception(): [ 650.331464] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 650.331464] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] self.force_reraise() [ 650.331464] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 650.331464] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] raise self.value [ 650.331464] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 650.331464] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] updated_port = self._update_port( [ 650.331464] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/network/neutron.py", line 585, in 
_update_port [ 650.331464] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] _ensure_no_port_binding_failure(port) [ 650.331464] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 650.331464] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] raise exception.PortBindingFailed(port_id=port['id']) [ 650.331862] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] nova.exception.PortBindingFailed: Binding failed for port 88a8eeb7-54a6-49a9-aae8-984e651885b1, please check neutron logs for more information. [ 650.331862] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] [ 650.331862] env[61439]: INFO nova.compute.manager [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Terminating instance [ 650.336100] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Acquiring lock "refresh_cache-224afc94-367d-4d30-95fb-b8b865f56eb9" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 650.336168] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Acquired lock "refresh_cache-224afc94-367d-4d30-95fb-b8b865f56eb9" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 650.336326] env[61439]: DEBUG nova.network.neutron [None 
req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 650.392277] env[61439]: DEBUG nova.network.neutron [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 650.402049] env[61439]: DEBUG nova.network.neutron [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 650.605958] env[61439]: DEBUG nova.network.neutron [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 650.622650] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Releasing lock "refresh_cache-224afc94-367d-4d30-95fb-b8b865f56eb9" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 650.622650] env[61439]: DEBUG nova.compute.manager [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 
tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 650.622650] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 650.623424] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d3a3aa9d-8039-4588-9863-af1f29b6b658 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.633344] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf130ca8-9a00-49b6-a2b0-4956afba4a6d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.656989] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 224afc94-367d-4d30-95fb-b8b865f56eb9 could not be found. 
[ 650.657214] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 650.657398] env[61439]: INFO nova.compute.manager [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Took 0.03 seconds to destroy the instance on the hypervisor. [ 650.657655] env[61439]: DEBUG oslo.service.loopingcall [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 650.657868] env[61439]: DEBUG nova.compute.manager [-] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 650.657963] env[61439]: DEBUG nova.network.neutron [-] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 650.676359] env[61439]: DEBUG nova.network.neutron [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 650.689951] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Releasing lock "refresh_cache-10f688a0-63af-496f-93dd-794083acf94b" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 650.689951] env[61439]: DEBUG nova.compute.manager [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 650.689951] env[61439]: DEBUG nova.compute.manager [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 650.689951] env[61439]: DEBUG nova.network.neutron [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 650.709334] env[61439]: DEBUG nova.network.neutron [-] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 650.719231] env[61439]: DEBUG nova.network.neutron [-] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 650.729493] env[61439]: INFO nova.compute.manager [-] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Took 0.07 seconds to deallocate network for instance. 
[ 650.735031] env[61439]: DEBUG nova.compute.claims [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 650.735229] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 650.735447] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 650.747282] env[61439]: DEBUG nova.network.neutron [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 650.756119] env[61439]: DEBUG nova.network.neutron [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 650.776665] env[61439]: INFO nova.compute.manager [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 10f688a0-63af-496f-93dd-794083acf94b] Took 0.09 seconds to deallocate network for instance. [ 650.863978] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c1d663b-b3a3-4fda-b85f-cc29ef827048 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.878127] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92160409-5a02-4d65-83c8-696d8c20cdf6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.913560] env[61439]: INFO nova.scheduler.client.report [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Deleted allocations for instance 10f688a0-63af-496f-93dd-794083acf94b [ 650.919180] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-089289c9-1ec9-4bf8-ae7c-888d53bedddb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.928383] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-281dc317-039d-487e-b5f0-45697299fd92 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 650.944251] env[61439]: DEBUG nova.compute.provider_tree [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 650.946359] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d7088f1-7eb7-4dee-9946-47c07d4fd4c9 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Lock "10f688a0-63af-496f-93dd-794083acf94b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.401s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 650.958692] env[61439]: DEBUG nova.scheduler.client.report [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 650.981612] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 
tempest-DeleteServersAdminTestJSON-2134891279-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.246s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 650.982322] env[61439]: ERROR nova.compute.manager [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 88a8eeb7-54a6-49a9-aae8-984e651885b1, please check neutron logs for more information. [ 650.982322] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Traceback (most recent call last): [ 650.982322] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 650.982322] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] self.driver.spawn(context, instance, image_meta, [ 650.982322] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 650.982322] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 650.982322] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 650.982322] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] vm_ref = self.build_virtual_machine(instance, [ 650.982322] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 650.982322] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] vif_infos = vmwarevif.get_vif_info(self._session, [ 650.982322] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 650.982877] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] for vif in network_info: [ 650.982877] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 650.982877] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] return self._sync_wrapper(fn, *args, **kwargs) [ 650.982877] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 650.982877] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] self.wait() [ 650.982877] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 650.982877] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] self[:] = self._gt.wait() [ 650.982877] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 650.982877] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] return self._exit_event.wait() [ 650.982877] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 650.982877] env[61439]: 
ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] result = hub.switch() [ 650.982877] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 650.982877] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] return self.greenlet.switch() [ 650.983371] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 650.983371] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] result = function(*args, **kwargs) [ 650.983371] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 650.983371] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] return func(*args, **kwargs) [ 650.983371] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 650.983371] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] raise e [ 650.983371] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 650.983371] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] nwinfo = self.network_api.allocate_for_instance( [ 650.983371] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 650.983371] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] 
created_port_ids = self._update_ports_for_instance( [ 650.983371] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 650.983371] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] with excutils.save_and_reraise_exception(): [ 650.983371] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 650.983694] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] self.force_reraise() [ 650.983694] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 650.983694] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] raise self.value [ 650.983694] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 650.983694] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] updated_port = self._update_port( [ 650.983694] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 650.983694] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] _ensure_no_port_binding_failure(port) [ 650.983694] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 650.983694] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] raise 
exception.PortBindingFailed(port_id=port['id']) [ 650.983694] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] nova.exception.PortBindingFailed: Binding failed for port 88a8eeb7-54a6-49a9-aae8-984e651885b1, please check neutron logs for more information. [ 650.983694] env[61439]: ERROR nova.compute.manager [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] [ 650.983958] env[61439]: DEBUG nova.compute.utils [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Binding failed for port 88a8eeb7-54a6-49a9-aae8-984e651885b1, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 650.985728] env[61439]: DEBUG nova.compute.manager [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Build of instance 224afc94-367d-4d30-95fb-b8b865f56eb9 was re-scheduled: Binding failed for port 88a8eeb7-54a6-49a9-aae8-984e651885b1, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 650.985728] env[61439]: DEBUG nova.compute.manager [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 650.985728] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Acquiring lock "refresh_cache-224afc94-367d-4d30-95fb-b8b865f56eb9" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 650.985728] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Acquired lock "refresh_cache-224afc94-367d-4d30-95fb-b8b865f56eb9" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 650.985947] env[61439]: DEBUG nova.network.neutron [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 651.076224] env[61439]: DEBUG nova.network.neutron [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 651.300695] env[61439]: DEBUG nova.network.neutron [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 651.312110] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Releasing lock "refresh_cache-224afc94-367d-4d30-95fb-b8b865f56eb9" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 651.312579] env[61439]: DEBUG nova.compute.manager [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 651.312809] env[61439]: DEBUG nova.compute.manager [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 651.313349] env[61439]: DEBUG nova.network.neutron [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 651.412879] env[61439]: DEBUG nova.network.neutron [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 651.422200] env[61439]: DEBUG nova.network.neutron [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 651.432949] env[61439]: INFO nova.compute.manager [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 224afc94-367d-4d30-95fb-b8b865f56eb9] Took 0.12 seconds to deallocate network for instance. 
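The tracebacks above pass through `oslo_utils.excutils.save_and_reraise_exception`: its `__exit__` calls `force_reraise()`, which does `raise self.value`, so the original `PortBindingFailed` propagates after cleanup has run inside the `with` block. A minimal stdlib sketch of that pattern (illustrative reimplementation, not the real oslo_utils code):

```python
class save_and_reraise_exception:
    """Capture an in-flight exception, let cleanup run, then re-raise it.

    Mirrors the oslo_utils.excutils pattern visible in the stack trace:
    __exit__ saves the exception and calls force_reraise(), which does
    ``raise self.value`` so the original error keeps propagating.
    Setting ``reraise=False`` suppresses the exception instead.
    """

    def __init__(self, reraise=True):
        self.reraise = reraise
        self.value = None

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_value, exc_tb):
        self.value = exc_value
        if exc_type is not None and self.reraise:
            self.force_reraise()
        # Returning True suppresses the exception (only reached when
        # reraise=False or no exception occurred).
        return True

    def force_reraise(self):
        raise self.value


# Usage: cleanup code inside the with-block runs, then the error re-raises.
try:
    with save_and_reraise_exception():
        raise ValueError("port binding failed")
except ValueError:
    pass  # the original exception reached the caller
```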
[ 651.579025] env[61439]: INFO nova.scheduler.client.report [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Deleted allocations for instance 224afc94-367d-4d30-95fb-b8b865f56eb9 [ 651.614455] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3ba9f10e-7ab1-445e-994d-cf5cea16be51 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Lock "224afc94-367d-4d30-95fb-b8b865f56eb9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.300s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 652.231616] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Acquiring lock "3d2438d5-9a7c-4cc6-b229-ab2389a78801" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 652.231841] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Lock "3d2438d5-9a7c-4cc6-b229-ab2389a78801" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 652.248897] env[61439]: DEBUG nova.compute.manager [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 652.337487] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 652.337487] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 652.337487] env[61439]: INFO nova.compute.claims [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 652.511307] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e91c60c3-b347-4cdf-8e96-78626eed5e10 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.520430] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-440739ae-5f5c-4534-ab72-2eb8105f1199 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.561321] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e25b97c4-86d1-460a-a5ed-4b71e4625305 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.572084] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b80a3de1-7bd3-4868-95e7-0ffdfa453a77 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.589320] env[61439]: DEBUG nova.compute.provider_tree [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 652.610118] env[61439]: DEBUG nova.scheduler.client.report [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 652.630715] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.296s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 652.631258] env[61439]: DEBUG nova.compute.manager [None 
req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 652.683188] env[61439]: DEBUG nova.compute.utils [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 652.685100] env[61439]: DEBUG nova.compute.manager [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 652.685100] env[61439]: DEBUG nova.network.neutron [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 652.696910] env[61439]: DEBUG nova.compute.manager [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Start building block device mappings for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 652.780500] env[61439]: DEBUG nova.compute.manager [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 652.815089] env[61439]: DEBUG nova.virt.hardware [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 652.815342] env[61439]: DEBUG nova.virt.hardware [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 652.815505] env[61439]: DEBUG nova.virt.hardware [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 
tempest-ServersTestFqdnHostnames-151657379-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 652.815685] env[61439]: DEBUG nova.virt.hardware [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 652.815902] env[61439]: DEBUG nova.virt.hardware [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 652.815966] env[61439]: DEBUG nova.virt.hardware [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 652.817031] env[61439]: DEBUG nova.virt.hardware [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 652.818115] env[61439]: DEBUG nova.virt.hardware [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 652.818115] env[61439]: DEBUG nova.virt.hardware [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 
tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 652.818115] env[61439]: DEBUG nova.virt.hardware [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 652.818115] env[61439]: DEBUG nova.virt.hardware [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 652.818934] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd8e6833-5bb2-4a60-9450-aa8149c263aa {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.829217] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1184e45-7af4-4a81-91b8-6ae6755460b6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.935510] env[61439]: DEBUG nova.policy [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '91295abcd6c74795abb5da3e18f72f38', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '26c233a104004a87a12bd1f2fd2c0dbb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 
'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 653.288790] env[61439]: ERROR nova.compute.manager [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 3aa58b3c-4708-4d52-ac7e-6e0f2eee48f5, please check neutron logs for more information. [ 653.288790] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 653.288790] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 653.288790] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 653.288790] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 653.288790] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 653.288790] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 653.288790] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 653.288790] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 653.288790] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 653.288790] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 653.288790] env[61439]: ERROR nova.compute.manager raise self.value [ 653.288790] env[61439]: ERROR nova.compute.manager File 
"/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 653.288790] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 653.288790] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 653.288790] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 653.289288] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 653.289288] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 653.289288] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 3aa58b3c-4708-4d52-ac7e-6e0f2eee48f5, please check neutron logs for more information. [ 653.289288] env[61439]: ERROR nova.compute.manager [ 653.289577] env[61439]: Traceback (most recent call last): [ 653.290060] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 653.290060] env[61439]: listener.cb(fileno) [ 653.290060] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 653.290060] env[61439]: result = function(*args, **kwargs) [ 653.290060] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 653.290060] env[61439]: return func(*args, **kwargs) [ 653.290060] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 653.290060] env[61439]: raise e [ 653.290060] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 653.290060] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 653.290060] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 653.290060] env[61439]: created_port_ids = self._update_ports_for_instance( [ 653.290060] 
env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 653.290060] env[61439]: with excutils.save_and_reraise_exception(): [ 653.290486] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 653.290486] env[61439]: self.force_reraise() [ 653.290486] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 653.290486] env[61439]: raise self.value [ 653.290486] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 653.290486] env[61439]: updated_port = self._update_port( [ 653.290486] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 653.290486] env[61439]: _ensure_no_port_binding_failure(port) [ 653.290486] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 653.290486] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 653.290486] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 3aa58b3c-4708-4d52-ac7e-6e0f2eee48f5, please check neutron logs for more information. [ 653.290486] env[61439]: Removing descriptor: 22 [ 653.291493] env[61439]: ERROR nova.compute.manager [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 3aa58b3c-4708-4d52-ac7e-6e0f2eee48f5, please check neutron logs for more information. 
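The `PortBindingFailed` in these traces originates in `_ensure_no_port_binding_failure`: Neutron signals a failed binding by setting the port's `binding:vif_type` attribute to `binding_failed`, and Nova converts that into an exception. A sketch of that check, with a stand-in for `nova.exception.PortBindingFailed` (names besides the logged ones are illustrative):

```python
# Neutron marks a port whose binding failed with this vif_type value.
VIF_TYPE_BINDING_FAILED = 'binding_failed'


class PortBindingFailed(Exception):
    """Stand-in for nova.exception.PortBindingFailed."""

    def __init__(self, port_id):
        self.port_id = port_id
        super().__init__(
            f"Binding failed for port {port_id}, "
            "please check neutron logs for more information.")


def ensure_no_port_binding_failure(port):
    """Sketch of nova.network.neutron._ensure_no_port_binding_failure."""
    if port.get('binding:vif_type') == VIF_TYPE_BINDING_FAILED:
        raise PortBindingFailed(port_id=port['id'])


# A healthy port passes silently; a failed binding raises.
ensure_no_port_binding_failure({'id': 'p1', 'binding:vif_type': 'ovs'})
```

In practice this means the root cause is on the Neutron side (no mechanism driver could bind the port on the target host); the Nova log only reports the outcome, which is why the message points at the Neutron logs.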
[ 653.291493] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Traceback (most recent call last): [ 653.291493] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 653.291493] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] yield resources [ 653.291493] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 653.291493] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] self.driver.spawn(context, instance, image_meta, [ 653.291493] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 653.291493] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 653.291493] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 653.291493] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] vm_ref = self.build_virtual_machine(instance, [ 653.291493] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 653.291842] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] vif_infos = vmwarevif.get_vif_info(self._session, [ 653.291842] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 653.291842] env[61439]: ERROR 
nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] for vif in network_info: [ 653.291842] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 653.291842] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] return self._sync_wrapper(fn, *args, **kwargs) [ 653.291842] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 653.291842] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] self.wait() [ 653.291842] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 653.291842] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] self[:] = self._gt.wait() [ 653.291842] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 653.291842] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] return self._exit_event.wait() [ 653.291842] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 653.291842] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] result = hub.switch() [ 653.292247] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 653.292247] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] return self.greenlet.switch() [ 653.292247] 
env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 653.292247] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] result = function(*args, **kwargs) [ 653.292247] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 653.292247] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] return func(*args, **kwargs) [ 653.292247] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 653.292247] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] raise e [ 653.292247] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 653.292247] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] nwinfo = self.network_api.allocate_for_instance( [ 653.292247] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 653.292247] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] created_port_ids = self._update_ports_for_instance( [ 653.292247] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 653.292588] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] with excutils.save_and_reraise_exception(): [ 653.292588] env[61439]: ERROR nova.compute.manager 
[instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 653.292588] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] self.force_reraise() [ 653.292588] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 653.292588] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] raise self.value [ 653.292588] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 653.292588] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] updated_port = self._update_port( [ 653.292588] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 653.292588] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] _ensure_no_port_binding_failure(port) [ 653.292588] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 653.292588] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] raise exception.PortBindingFailed(port_id=port['id']) [ 653.292588] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] nova.exception.PortBindingFailed: Binding failed for port 3aa58b3c-4708-4d52-ac7e-6e0f2eee48f5, please check neutron logs for more information. 
[ 653.292588] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e]
[ 653.293741] env[61439]: INFO nova.compute.manager [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Terminating instance
[ 653.295292] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Acquiring lock "refresh_cache-025daa70-33c7-4a0d-addf-94b680ad8c4e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 653.295861] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Acquired lock "refresh_cache-025daa70-33c7-4a0d-addf-94b680ad8c4e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 653.295861] env[61439]: DEBUG nova.network.neutron [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 653.413791] env[61439]: DEBUG nova.network.neutron [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 653.909695] env[61439]: DEBUG nova.network.neutron [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 653.923504] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Releasing lock "refresh_cache-025daa70-33c7-4a0d-addf-94b680ad8c4e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 653.923504] env[61439]: DEBUG nova.compute.manager [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 653.923598] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 653.925369] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-543d9be1-f61b-416a-96b7-89fb2f41a308 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 653.936826] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31ede734-b9d6-4035-88ed-511a4627685b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 653.970038] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 025daa70-33c7-4a0d-addf-94b680ad8c4e could not be found.
[ 653.970038] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 653.970269] env[61439]: INFO nova.compute.manager [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Took 0.05 seconds to destroy the instance on the hypervisor.
[ 653.970603] env[61439]: DEBUG oslo.service.loopingcall [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 653.970870] env[61439]: DEBUG nova.compute.manager [-] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 653.970990] env[61439]: DEBUG nova.network.neutron [-] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 654.031945] env[61439]: DEBUG nova.network.neutron [-] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 654.045748] env[61439]: DEBUG nova.network.neutron [-] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 654.062574] env[61439]: INFO nova.compute.manager [-] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Took 0.09 seconds to deallocate network for instance.
[ 654.065773] env[61439]: DEBUG nova.compute.claims [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 654.065773] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 654.065773] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 654.150252] env[61439]: DEBUG nova.network.neutron [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Successfully created port: e2841af4-5a95-4a05-bbd9-6e514778a012 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 654.210942] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39f9fa80-66f4-47dc-b4af-8a6ab7115d27 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 654.219448] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b713d54a-b77e-4917-ab03-7f8ff5efbcda {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 654.824625] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ebdc4f7-82aa-4f70-ba31-b594c222e679 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 654.832915] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1dbfc06-191a-4905-a5f2-430b41b2e7ea {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 654.849121] env[61439]: DEBUG nova.compute.provider_tree [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 654.871900] env[61439]: DEBUG nova.scheduler.client.report [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 654.899896] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.834s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 654.900604] env[61439]: ERROR nova.compute.manager [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 3aa58b3c-4708-4d52-ac7e-6e0f2eee48f5, please check neutron logs for more information.
[ 654.900604] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Traceback (most recent call last):
[ 654.900604] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 654.900604] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] self.driver.spawn(context, instance, image_meta,
[ 654.900604] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 654.900604] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 654.900604] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 654.900604] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] vm_ref = self.build_virtual_machine(instance,
[ 654.900604] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 654.900604] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] vif_infos = vmwarevif.get_vif_info(self._session,
[ 654.900604] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 654.900942] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] for vif in network_info:
[ 654.900942] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 654.900942] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] return self._sync_wrapper(fn, *args, **kwargs)
[ 654.900942] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 654.900942] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] self.wait()
[ 654.900942] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 654.900942] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] self[:] = self._gt.wait()
[ 654.900942] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 654.900942] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] return self._exit_event.wait()
[ 654.900942] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 654.900942] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] result = hub.switch()
[ 654.900942] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 654.900942] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] return self.greenlet.switch()
[ 654.901331] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 654.901331] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] result = function(*args, **kwargs)
[ 654.901331] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 654.901331] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] return func(*args, **kwargs)
[ 654.901331] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 654.901331] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] raise e
[ 654.901331] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 654.901331] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] nwinfo = self.network_api.allocate_for_instance(
[ 654.901331] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 654.901331] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] created_port_ids = self._update_ports_for_instance(
[ 654.901331] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 654.901331] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] with excutils.save_and_reraise_exception():
[ 654.901331] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 654.901652] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] self.force_reraise()
[ 654.901652] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 654.901652] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] raise self.value
[ 654.901652] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 654.901652] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] updated_port = self._update_port(
[ 654.901652] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 654.901652] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] _ensure_no_port_binding_failure(port)
[ 654.901652] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 654.901652] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] raise exception.PortBindingFailed(port_id=port['id'])
[ 654.901652] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] nova.exception.PortBindingFailed: Binding failed for port 3aa58b3c-4708-4d52-ac7e-6e0f2eee48f5, please check neutron logs for more information.
[ 654.901652] env[61439]: ERROR nova.compute.manager [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e]
[ 654.901917] env[61439]: DEBUG nova.compute.utils [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Binding failed for port 3aa58b3c-4708-4d52-ac7e-6e0f2eee48f5, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 654.904875] env[61439]: DEBUG nova.compute.manager [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Build of instance 025daa70-33c7-4a0d-addf-94b680ad8c4e was re-scheduled: Binding failed for port 3aa58b3c-4708-4d52-ac7e-6e0f2eee48f5, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 654.904875] env[61439]: DEBUG nova.compute.manager [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 654.904875] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Acquiring lock "refresh_cache-025daa70-33c7-4a0d-addf-94b680ad8c4e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 654.904875] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Acquired lock "refresh_cache-025daa70-33c7-4a0d-addf-94b680ad8c4e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 654.905599] env[61439]: DEBUG nova.network.neutron [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 654.979801] env[61439]: DEBUG nova.network.neutron [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 655.435433] env[61439]: DEBUG nova.network.neutron [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 655.458280] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Releasing lock "refresh_cache-025daa70-33c7-4a0d-addf-94b680ad8c4e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 655.458280] env[61439]: DEBUG nova.compute.manager [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 655.458280] env[61439]: DEBUG nova.compute.manager [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 655.458280] env[61439]: DEBUG nova.network.neutron [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 655.542264] env[61439]: DEBUG nova.network.neutron [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 655.559692] env[61439]: DEBUG nova.network.neutron [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 655.579237] env[61439]: INFO nova.compute.manager [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] [instance: 025daa70-33c7-4a0d-addf-94b680ad8c4e] Took 0.12 seconds to deallocate network for instance.
[ 655.694568] env[61439]: INFO nova.scheduler.client.report [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Deleted allocations for instance 025daa70-33c7-4a0d-addf-94b680ad8c4e
[ 655.719279] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f1b0dd4-4f52-4264-9da2-e4bbd58a9771 tempest-ImagesOneServerNegativeTestJSON-1421586542 tempest-ImagesOneServerNegativeTestJSON-1421586542-project-member] Lock "025daa70-33c7-4a0d-addf-94b680ad8c4e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.298s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 657.174069] env[61439]: DEBUG oslo_concurrency.lockutils [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Acquiring lock "c3fd73f2-027d-42d3-ad53-4925403a9c92" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 657.174069] env[61439]: DEBUG oslo_concurrency.lockutils [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Lock "c3fd73f2-027d-42d3-ad53-4925403a9c92" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 657.192041] env[61439]: DEBUG nova.compute.manager [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 657.281463] env[61439]: DEBUG oslo_concurrency.lockutils [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 657.281724] env[61439]: DEBUG oslo_concurrency.lockutils [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 657.283505] env[61439]: INFO nova.compute.claims [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 657.467847] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-618c4fcf-585c-4244-b5e7-f4d643e0d0dc {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 657.480018] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67167415-b9bc-4023-8254-e9a2537d9011 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 657.513343] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8368881-4611-468e-9087-43098716ac4a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 657.521654] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-932cadc3-c03f-416b-8aa0-140c0571b11a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 657.536821] env[61439]: DEBUG nova.compute.provider_tree [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 657.551015] env[61439]: DEBUG nova.scheduler.client.report [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 657.574447] env[61439]: DEBUG oslo_concurrency.lockutils [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.291s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 657.574447] env[61439]: DEBUG nova.compute.manager [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 657.626139] env[61439]: DEBUG nova.compute.utils [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 657.629702] env[61439]: DEBUG nova.compute.manager [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 657.629702] env[61439]: DEBUG nova.network.neutron [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 657.638870] env[61439]: DEBUG nova.compute.manager [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 657.712920] env[61439]: DEBUG nova.compute.manager [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 657.740083] env[61439]: DEBUG nova.virt.hardware [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 657.740845] env[61439]: DEBUG nova.virt.hardware [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 657.740906] env[61439]: DEBUG nova.virt.hardware [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 657.742293] env[61439]: DEBUG nova.virt.hardware [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 657.742293] env[61439]: DEBUG nova.virt.hardware [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 657.742293] env[61439]: DEBUG nova.virt.hardware [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 657.742293] env[61439]: DEBUG nova.virt.hardware [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 657.742293] env[61439]: DEBUG nova.virt.hardware [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 657.742720] env[61439]: DEBUG nova.virt.hardware [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e
tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 657.742720] env[61439]: DEBUG nova.virt.hardware [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 657.742720] env[61439]: DEBUG nova.virt.hardware [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 657.743515] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6778e9ff-eb28-466d-bfe2-1ff750342311 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 657.758097] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aafb63b0-4fc4-495d-8681-5546d81ef4a5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 657.843014] env[61439]: DEBUG nova.policy [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2d0376ef3212459b883f3b757a17316f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c85d469fb8e045f7b6981676c526d780', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': 
True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 658.250694] env[61439]: ERROR nova.compute.manager [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port e2841af4-5a95-4a05-bbd9-6e514778a012, please check neutron logs for more information. [ 658.250694] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 658.250694] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 658.250694] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 658.250694] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 658.250694] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 658.250694] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 658.250694] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 658.250694] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 658.250694] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 658.250694] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 658.250694] env[61439]: ERROR nova.compute.manager raise self.value [ 658.250694] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in 
_update_ports_for_instance [ 658.250694] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 658.250694] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 658.250694] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 658.252020] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 658.252020] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 658.252020] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port e2841af4-5a95-4a05-bbd9-6e514778a012, please check neutron logs for more information. [ 658.252020] env[61439]: ERROR nova.compute.manager [ 658.252020] env[61439]: Traceback (most recent call last): [ 658.252020] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 658.252020] env[61439]: listener.cb(fileno) [ 658.252020] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 658.252020] env[61439]: result = function(*args, **kwargs) [ 658.252020] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 658.252020] env[61439]: return func(*args, **kwargs) [ 658.252020] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 658.252020] env[61439]: raise e [ 658.252020] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 658.252020] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 658.252020] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 658.252020] env[61439]: created_port_ids = self._update_ports_for_instance( [ 658.252020] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", 
line 1365, in _update_ports_for_instance [ 658.252020] env[61439]: with excutils.save_and_reraise_exception(): [ 658.252020] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 658.252020] env[61439]: self.force_reraise() [ 658.252020] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 658.252020] env[61439]: raise self.value [ 658.252020] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 658.252020] env[61439]: updated_port = self._update_port( [ 658.252020] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 658.252020] env[61439]: _ensure_no_port_binding_failure(port) [ 658.252020] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 658.252020] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 658.252753] env[61439]: nova.exception.PortBindingFailed: Binding failed for port e2841af4-5a95-4a05-bbd9-6e514778a012, please check neutron logs for more information. [ 658.252753] env[61439]: Removing descriptor: 21 [ 658.252753] env[61439]: ERROR nova.compute.manager [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port e2841af4-5a95-4a05-bbd9-6e514778a012, please check neutron logs for more information. 
[ 658.252753] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Traceback (most recent call last): [ 658.252753] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 658.252753] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] yield resources [ 658.252753] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 658.252753] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] self.driver.spawn(context, instance, image_meta, [ 658.252753] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 658.252753] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] self._vmops.spawn(context, instance, image_meta, injected_files, [ 658.252753] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 658.252753] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] vm_ref = self.build_virtual_machine(instance, [ 658.253057] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 658.253057] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] vif_infos = vmwarevif.get_vif_info(self._session, [ 658.253057] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 658.253057] env[61439]: ERROR 
nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] for vif in network_info: [ 658.253057] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 658.253057] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] return self._sync_wrapper(fn, *args, **kwargs) [ 658.253057] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 658.253057] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] self.wait() [ 658.253057] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 658.253057] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] self[:] = self._gt.wait() [ 658.253057] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 658.253057] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] return self._exit_event.wait() [ 658.253057] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 658.253426] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] result = hub.switch() [ 658.253426] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 658.253426] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] return self.greenlet.switch() [ 658.253426] 
env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 658.253426] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] result = function(*args, **kwargs) [ 658.253426] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 658.253426] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] return func(*args, **kwargs) [ 658.253426] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 658.253426] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] raise e [ 658.253426] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 658.253426] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] nwinfo = self.network_api.allocate_for_instance( [ 658.253426] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 658.253426] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] created_port_ids = self._update_ports_for_instance( [ 658.254041] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 658.254041] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] with excutils.save_and_reraise_exception(): [ 658.254041] env[61439]: ERROR nova.compute.manager 
[instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 658.254041] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] self.force_reraise() [ 658.254041] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 658.254041] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] raise self.value [ 658.254041] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 658.254041] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] updated_port = self._update_port( [ 658.254041] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 658.254041] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] _ensure_no_port_binding_failure(port) [ 658.254041] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 658.254041] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] raise exception.PortBindingFailed(port_id=port['id']) [ 658.254385] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] nova.exception.PortBindingFailed: Binding failed for port e2841af4-5a95-4a05-bbd9-6e514778a012, please check neutron logs for more information. 
[ 658.254385] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] [ 658.254385] env[61439]: INFO nova.compute.manager [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Terminating instance [ 658.258153] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Acquiring lock "refresh_cache-3d2438d5-9a7c-4cc6-b229-ab2389a78801" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 658.258327] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Acquired lock "refresh_cache-3d2438d5-9a7c-4cc6-b229-ab2389a78801" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 658.258497] env[61439]: DEBUG nova.network.neutron [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 658.316462] env[61439]: DEBUG nova.network.neutron [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 658.581173] env[61439]: DEBUG nova.network.neutron [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 658.600221] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Releasing lock "refresh_cache-3d2438d5-9a7c-4cc6-b229-ab2389a78801" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 658.600608] env[61439]: DEBUG nova.compute.manager [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 658.601316] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 658.601316] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f4e9ae0d-f4ab-4645-8439-0f2a6bad0bc7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 658.615352] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca541d75-c8ed-48c0-b0af-119112726b2b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 658.640333] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3d2438d5-9a7c-4cc6-b229-ab2389a78801 could not be found. 
[ 658.640507] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 658.640659] env[61439]: INFO nova.compute.manager [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Took 0.04 seconds to destroy the instance on the hypervisor. [ 658.640912] env[61439]: DEBUG oslo.service.loopingcall [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 658.641660] env[61439]: DEBUG nova.compute.manager [-] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 658.641660] env[61439]: DEBUG nova.network.neutron [-] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 658.664711] env[61439]: DEBUG nova.network.neutron [-] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 658.677152] env[61439]: DEBUG nova.network.neutron [-] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 658.696966] env[61439]: INFO nova.compute.manager [-] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Took 0.06 seconds to deallocate network for instance. [ 658.701017] env[61439]: DEBUG nova.compute.claims [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 658.701017] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 658.701017] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 658.795425] env[61439]: DEBUG nova.network.neutron [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Successfully created port: 6787e179-849f-4357-9cfb-6b6f599dc8c9 
{{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 658.840589] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e0c942f-5ce8-428f-b1de-0fa305b0d3f7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 658.850835] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18d3d6c0-b0b7-4b2a-8915-bc4bd12b6db6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 658.906368] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28a5d127-9c64-41cb-823c-217d3853e2e5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 658.917703] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af22701a-9b66-4d01-a6dc-db069945b018 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 658.943093] env[61439]: DEBUG nova.compute.provider_tree [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 658.961462] env[61439]: DEBUG nova.scheduler.client.report [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 
196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 658.982404] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.282s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 658.983043] env[61439]: ERROR nova.compute.manager [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port e2841af4-5a95-4a05-bbd9-6e514778a012, please check neutron logs for more information. 
[ 658.983043] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Traceback (most recent call last): [ 658.983043] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 658.983043] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] self.driver.spawn(context, instance, image_meta, [ 658.983043] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 658.983043] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] self._vmops.spawn(context, instance, image_meta, injected_files, [ 658.983043] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 658.983043] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] vm_ref = self.build_virtual_machine(instance, [ 658.983043] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 658.983043] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] vif_infos = vmwarevif.get_vif_info(self._session, [ 658.983043] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 658.983450] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] for vif in network_info: [ 658.983450] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 658.983450] env[61439]: ERROR 
nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] return self._sync_wrapper(fn, *args, **kwargs) [ 658.983450] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 658.983450] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] self.wait() [ 658.983450] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 658.983450] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] self[:] = self._gt.wait() [ 658.983450] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 658.983450] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] return self._exit_event.wait() [ 658.983450] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 658.983450] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] result = hub.switch() [ 658.983450] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 658.983450] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] return self.greenlet.switch() [ 658.983783] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 658.983783] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] result 
= function(*args, **kwargs) [ 658.983783] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 658.983783] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] return func(*args, **kwargs) [ 658.983783] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 658.983783] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] raise e [ 658.983783] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 658.983783] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] nwinfo = self.network_api.allocate_for_instance( [ 658.983783] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 658.983783] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] created_port_ids = self._update_ports_for_instance( [ 658.983783] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 658.983783] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] with excutils.save_and_reraise_exception(): [ 658.983783] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 658.984150] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] self.force_reraise() [ 658.984150] env[61439]: 
ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 658.984150] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] raise self.value [ 658.984150] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 658.984150] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] updated_port = self._update_port( [ 658.984150] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 658.984150] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] _ensure_no_port_binding_failure(port) [ 658.984150] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 658.984150] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] raise exception.PortBindingFailed(port_id=port['id']) [ 658.984150] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] nova.exception.PortBindingFailed: Binding failed for port e2841af4-5a95-4a05-bbd9-6e514778a012, please check neutron logs for more information. 
[ 658.984150] env[61439]: ERROR nova.compute.manager [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] [ 658.984981] env[61439]: DEBUG nova.compute.utils [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Binding failed for port e2841af4-5a95-4a05-bbd9-6e514778a012, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 658.986708] env[61439]: DEBUG nova.compute.manager [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Build of instance 3d2438d5-9a7c-4cc6-b229-ab2389a78801 was re-scheduled: Binding failed for port e2841af4-5a95-4a05-bbd9-6e514778a012, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 658.989670] env[61439]: DEBUG nova.compute.manager [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 658.989919] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Acquiring lock "refresh_cache-3d2438d5-9a7c-4cc6-b229-ab2389a78801" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 658.990106] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 
tempest-ServersTestFqdnHostnames-151657379-project-member] Acquired lock "refresh_cache-3d2438d5-9a7c-4cc6-b229-ab2389a78801" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 658.990256] env[61439]: DEBUG nova.network.neutron [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 659.007266] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Acquiring lock "fb8aee53-11e7-4e32-9225-06671bb511d7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 659.007551] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Lock "fb8aee53-11e7-4e32-9225-06671bb511d7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 659.021739] env[61439]: DEBUG nova.compute.manager [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 659.031182] env[61439]: DEBUG nova.network.neutron [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 659.105343] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 659.105343] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 659.108104] env[61439]: INFO nova.compute.claims [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 659.183159] env[61439]: DEBUG nova.network.neutron [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info 
/opt/stack/nova/nova/network/neutron.py:116}} [ 659.194908] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Releasing lock "refresh_cache-3d2438d5-9a7c-4cc6-b229-ab2389a78801" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 659.194908] env[61439]: DEBUG nova.compute.manager [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 659.196753] env[61439]: DEBUG nova.compute.manager [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 659.196971] env[61439]: DEBUG nova.network.neutron [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 659.224852] env[61439]: DEBUG nova.network.neutron [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 659.242274] env[61439]: DEBUG nova.network.neutron [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 659.257446] env[61439]: INFO nova.compute.manager [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] [instance: 3d2438d5-9a7c-4cc6-b229-ab2389a78801] Took 0.06 seconds to deallocate network for instance. [ 659.274872] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e291e631-e24a-4078-8138-87e2731f0652 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 659.286960] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9550b8f9-ed7c-4130-ae9e-dfa94e6b492e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 659.323085] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2c40688-20e9-4a48-bc87-2036d08df554 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 659.332088] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0724aa3-0e79-48d3-a42e-c591126d38ac {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 659.348924] env[61439]: DEBUG nova.compute.provider_tree [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 
tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 659.362747] env[61439]: DEBUG nova.scheduler.client.report [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 659.392652] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.288s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 659.393227] env[61439]: DEBUG nova.compute.manager [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Start building networks asynchronously for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 659.411877] env[61439]: INFO nova.scheduler.client.report [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Deleted allocations for instance 3d2438d5-9a7c-4cc6-b229-ab2389a78801 [ 659.459140] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ef9b7a46-0c2e-40a2-ab82-a8e14bb3ba53 tempest-ServersTestFqdnHostnames-151657379 tempest-ServersTestFqdnHostnames-151657379-project-member] Lock "3d2438d5-9a7c-4cc6-b229-ab2389a78801" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.227s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 659.478295] env[61439]: DEBUG nova.compute.utils [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 659.478295] env[61439]: DEBUG nova.compute.manager [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 659.478295] env[61439]: DEBUG nova.network.neutron [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 659.494337] env[61439]: DEBUG nova.compute.manager [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 659.596664] env[61439]: DEBUG nova.compute.manager [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 659.615125] env[61439]: DEBUG nova.policy [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '564a426ea7b54c38a2bb07a08fc935be', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '00123cdebdcf4b01ad920d353fcacff4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 659.634497] env[61439]: DEBUG nova.virt.hardware [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 659.634497] env[61439]: DEBUG nova.virt.hardware [None 
req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 659.634497] env[61439]: DEBUG nova.virt.hardware [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 659.634710] env[61439]: DEBUG nova.virt.hardware [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 659.634911] env[61439]: DEBUG nova.virt.hardware [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 659.635065] env[61439]: DEBUG nova.virt.hardware [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 659.635381] env[61439]: DEBUG nova.virt.hardware [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 659.635575] 
env[61439]: DEBUG nova.virt.hardware [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 659.635749] env[61439]: DEBUG nova.virt.hardware [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 659.635930] env[61439]: DEBUG nova.virt.hardware [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 659.636123] env[61439]: DEBUG nova.virt.hardware [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 659.637291] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c02da3d2-08e4-4c12-85fb-60cbf553b4a7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 659.653686] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f597d0c5-468d-4a63-aeee-7cc4b1fd59c9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 659.725994] env[61439]: DEBUG oslo_concurrency.lockutils [None 
req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Acquiring lock "4bce210a-e6bc-4c67-ad48-397422291467" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 659.726303] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Lock "4bce210a-e6bc-4c67-ad48-397422291467" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 659.739244] env[61439]: DEBUG nova.compute.manager [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 659.806402] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 659.809807] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 659.809807] env[61439]: INFO nova.compute.claims [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 659.995426] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4085a64-15d8-4238-88ab-c0a03ce423ea {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 660.008221] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c40f596-fcf4-477c-a18e-79ae00a1ed9d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 660.048724] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b066704-4e89-4f44-af4e-9af5fc8f3a6c {{(pid=61439) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 660.057264] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb3f2ae0-5a57-4b3f-aaec-b3e0758d1576 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 660.079936] env[61439]: DEBUG nova.compute.provider_tree [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 660.097741] env[61439]: DEBUG nova.scheduler.client.report [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 660.126038] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.317s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 660.126038] env[61439]: DEBUG 
nova.compute.manager [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 660.176062] env[61439]: DEBUG nova.compute.utils [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 660.177214] env[61439]: DEBUG nova.compute.manager [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 660.177214] env[61439]: DEBUG nova.network.neutron [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 660.199486] env[61439]: DEBUG nova.compute.manager [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Start building block device mappings for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 660.294735] env[61439]: DEBUG nova.compute.manager [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 660.338430] env[61439]: DEBUG nova.virt.hardware [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 660.338683] env[61439]: DEBUG nova.virt.hardware [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 660.338842] env[61439]: DEBUG nova.virt.hardware [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 
tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 660.340258] env[61439]: DEBUG nova.virt.hardware [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 660.340367] env[61439]: DEBUG nova.virt.hardware [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 660.340534] env[61439]: DEBUG nova.virt.hardware [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 660.340750] env[61439]: DEBUG nova.virt.hardware [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 660.340916] env[61439]: DEBUG nova.virt.hardware [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 660.341106] 
env[61439]: DEBUG nova.virt.hardware [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 660.341284] env[61439]: DEBUG nova.virt.hardware [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 660.341478] env[61439]: DEBUG nova.virt.hardware [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 660.342392] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c278b9e-9b24-4034-b1dc-225c3f2504ad {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 660.357087] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94c853c2-a7f6-4f0d-9c8d-55e270d246e9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 660.485607] env[61439]: DEBUG nova.policy [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ff121f6262f3409e93a2b29db03b6326', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 
'project_id': '5bf81a56b9294a7a846c62485175a74c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 660.737298] env[61439]: DEBUG oslo_concurrency.lockutils [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Acquiring lock "7c682f87-0bf4-460c-97a7-bd369d91a6a2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 660.737298] env[61439]: DEBUG oslo_concurrency.lockutils [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Lock "7c682f87-0bf4-460c-97a7-bd369d91a6a2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 660.755855] env[61439]: DEBUG nova.compute.manager [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 660.836253] env[61439]: DEBUG oslo_concurrency.lockutils [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 660.836253] env[61439]: DEBUG oslo_concurrency.lockutils [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 660.837124] env[61439]: INFO nova.compute.claims [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 660.897797] env[61439]: DEBUG nova.network.neutron [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Successfully created port: 4f72723f-9f91-4066-a9a6-fe0bf74afafd {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 661.007011] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-353c52a8-7c1d-4e60-a51e-d2d04fb58b18 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 661.016637] env[61439]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b7e795c-d4ae-4243-9bb5-968463662dfb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 661.052501] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ceff699-4582-413d-8989-99472bcd8655 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 661.059941] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95f5c9dd-89f7-427b-b95d-b50b760b5d3b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 661.075990] env[61439]: DEBUG nova.compute.provider_tree [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 661.085699] env[61439]: DEBUG nova.scheduler.client.report [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 661.106664] env[61439]: DEBUG oslo_concurrency.lockutils [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 
tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.272s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 661.107335] env[61439]: DEBUG nova.compute.manager [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 661.149438] env[61439]: DEBUG nova.compute.utils [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 661.150603] env[61439]: DEBUG nova.compute.manager [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 661.150777] env[61439]: DEBUG nova.network.neutron [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 661.162246] env[61439]: DEBUG nova.compute.manager [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 661.245677] env[61439]: DEBUG nova.compute.manager [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 661.278772] env[61439]: DEBUG nova.virt.hardware [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 661.279244] env[61439]: DEBUG nova.virt.hardware [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 661.279402] env[61439]: DEBUG nova.virt.hardware [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 661.280827] env[61439]: DEBUG nova.virt.hardware [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Flavor pref 0:0:0 
{{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 661.280827] env[61439]: DEBUG nova.virt.hardware [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 661.280827] env[61439]: DEBUG nova.virt.hardware [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 661.280827] env[61439]: DEBUG nova.virt.hardware [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 661.280827] env[61439]: DEBUG nova.virt.hardware [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 661.281080] env[61439]: DEBUG nova.virt.hardware [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 661.281080] env[61439]: DEBUG nova.virt.hardware [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] 
Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 661.281080] env[61439]: DEBUG nova.virt.hardware [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 661.281697] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f9af029-e23e-4069-a60e-723375c3de31 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 661.292831] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b52e0e4c-af75-45c5-8c35-8b6bb09bbd33 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 661.307818] env[61439]: DEBUG nova.policy [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1d6f330f0a944ff09915d62b9fe32137', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c39211357c76487183cc1181d114332a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 662.172114] env[61439]: DEBUG nova.network.neutron [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 
tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Successfully created port: dc8ac46f-f616-4f92-ad64-f212950b7634 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 662.536561] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Acquiring lock "edb47400-6749-4235-bbb1-ffa648f3dba5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 662.537092] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Lock "edb47400-6749-4235-bbb1-ffa648f3dba5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 662.540750] env[61439]: DEBUG nova.network.neutron [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Successfully created port: 9b109a6b-fa18-4dfc-9ac9-4beedf6be738 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 662.556987] env[61439]: DEBUG nova.compute.manager [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 662.636399] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 662.636821] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 662.638997] env[61439]: INFO nova.compute.claims [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 662.884921] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f00599e6-8f52-4858-85b4-1a27410492ec {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 662.896775] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a67e602f-7592-46ee-ab45-2499ab58d6c1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 662.928808] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a948ef33-b3d1-40a6-b3c0-e602a03c23ec {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 662.936988] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f74d4624-9b59-4d33-a739-92887a10042f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 662.950682] env[61439]: DEBUG nova.compute.provider_tree [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 662.966026] env[61439]: DEBUG nova.scheduler.client.report [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 662.985328] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.348s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 662.986631] env[61439]: DEBUG nova.compute.manager [None 
req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 663.035314] env[61439]: DEBUG nova.compute.utils [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 663.036572] env[61439]: DEBUG nova.compute.manager [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 663.036763] env[61439]: DEBUG nova.network.neutron [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 663.051204] env[61439]: DEBUG nova.compute.manager [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Start building block device mappings for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 663.131716] env[61439]: DEBUG nova.compute.manager [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 663.165622] env[61439]: DEBUG nova.virt.hardware [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 663.167070] env[61439]: DEBUG nova.virt.hardware [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 663.167070] env[61439]: DEBUG nova.virt.hardware [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 
tempest-AttachInterfacesTestJSON-210578950-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 663.167070] env[61439]: DEBUG nova.virt.hardware [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 663.167070] env[61439]: DEBUG nova.virt.hardware [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 663.167070] env[61439]: DEBUG nova.virt.hardware [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 663.167400] env[61439]: DEBUG nova.virt.hardware [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 663.167400] env[61439]: DEBUG nova.virt.hardware [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 663.167887] env[61439]: DEBUG nova.virt.hardware [None req-9006b775-fce8-4ed2-a697-d3002b236004 
tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 663.167962] env[61439]: DEBUG nova.virt.hardware [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 663.168127] env[61439]: DEBUG nova.virt.hardware [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 663.169322] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-482a86ca-3b6f-4099-9a38-45b2a707dc6a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 663.178175] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dde6e4f8-338a-40d2-801a-bb391528239c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 663.288244] env[61439]: DEBUG nova.policy [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e64ca57e567146098521cd7356b9e3e2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3e1db803f0ff4f29bb70e0a0d94c57e0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 
'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 663.747837] env[61439]: DEBUG oslo_concurrency.lockutils [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Acquiring lock "05067c00-8595-412c-ad68-095cba5bf6da" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 663.748159] env[61439]: DEBUG oslo_concurrency.lockutils [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Lock "05067c00-8595-412c-ad68-095cba5bf6da" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 663.759210] env[61439]: DEBUG nova.compute.manager [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 663.833426] env[61439]: DEBUG oslo_concurrency.lockutils [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 663.833687] env[61439]: DEBUG oslo_concurrency.lockutils [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 663.835148] env[61439]: INFO nova.compute.claims [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 663.985554] env[61439]: ERROR nova.compute.manager [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 6787e179-849f-4357-9cfb-6b6f599dc8c9, please check neutron logs for more information. 
[ 663.985554] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 663.985554] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 663.985554] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 663.985554] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 663.985554] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 663.985554] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 663.985554] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 663.985554] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 663.985554] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 663.985554] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 663.985554] env[61439]: ERROR nova.compute.manager raise self.value [ 663.985554] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 663.985554] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 663.985554] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 663.985554] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 663.986029] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 663.986029] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 663.986029] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 6787e179-849f-4357-9cfb-6b6f599dc8c9, please check neutron logs for more information. [ 663.986029] env[61439]: ERROR nova.compute.manager [ 663.986029] env[61439]: Traceback (most recent call last): [ 663.986029] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 663.986029] env[61439]: listener.cb(fileno) [ 663.986029] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 663.986029] env[61439]: result = function(*args, **kwargs) [ 663.986029] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 663.986029] env[61439]: return func(*args, **kwargs) [ 663.986029] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 663.986029] env[61439]: raise e [ 663.986029] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 663.986029] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 663.986029] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 663.986029] env[61439]: created_port_ids = self._update_ports_for_instance( [ 663.986029] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 663.986029] env[61439]: with excutils.save_and_reraise_exception(): [ 663.986029] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 663.986029] env[61439]: self.force_reraise() [ 663.986029] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 663.986029] env[61439]: raise self.value [ 663.986029] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 663.986029] env[61439]: 
updated_port = self._update_port( [ 663.986029] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 663.986029] env[61439]: _ensure_no_port_binding_failure(port) [ 663.986029] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 663.986029] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 663.986988] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 6787e179-849f-4357-9cfb-6b6f599dc8c9, please check neutron logs for more information. [ 663.986988] env[61439]: Removing descriptor: 22 [ 663.986988] env[61439]: ERROR nova.compute.manager [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 6787e179-849f-4357-9cfb-6b6f599dc8c9, please check neutron logs for more information. 
[ 663.986988] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Traceback (most recent call last): [ 663.986988] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 663.986988] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] yield resources [ 663.986988] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 663.986988] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] self.driver.spawn(context, instance, image_meta, [ 663.986988] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 663.986988] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] self._vmops.spawn(context, instance, image_meta, injected_files, [ 663.986988] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 663.986988] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] vm_ref = self.build_virtual_machine(instance, [ 663.987331] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 663.987331] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] vif_infos = vmwarevif.get_vif_info(self._session, [ 663.987331] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 663.987331] env[61439]: ERROR 
nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] for vif in network_info: [ 663.987331] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 663.987331] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] return self._sync_wrapper(fn, *args, **kwargs) [ 663.987331] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 663.987331] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] self.wait() [ 663.987331] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 663.987331] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] self[:] = self._gt.wait() [ 663.987331] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 663.987331] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] return self._exit_event.wait() [ 663.987331] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 663.987669] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] result = hub.switch() [ 663.987669] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 663.987669] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] return self.greenlet.switch() [ 663.987669] 
env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 663.987669] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] result = function(*args, **kwargs) [ 663.987669] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 663.987669] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] return func(*args, **kwargs) [ 663.987669] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 663.987669] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] raise e [ 663.987669] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 663.987669] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] nwinfo = self.network_api.allocate_for_instance( [ 663.987669] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 663.987669] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] created_port_ids = self._update_ports_for_instance( [ 663.988035] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 663.988035] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] with excutils.save_and_reraise_exception(): [ 663.988035] env[61439]: ERROR nova.compute.manager 
[instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 663.988035] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] self.force_reraise() [ 663.988035] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 663.988035] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] raise self.value [ 663.988035] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 663.988035] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] updated_port = self._update_port( [ 663.988035] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 663.988035] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] _ensure_no_port_binding_failure(port) [ 663.988035] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 663.988035] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] raise exception.PortBindingFailed(port_id=port['id']) [ 663.988399] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] nova.exception.PortBindingFailed: Binding failed for port 6787e179-849f-4357-9cfb-6b6f599dc8c9, please check neutron logs for more information. 
[ 663.988399] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] [ 663.988399] env[61439]: INFO nova.compute.manager [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Terminating instance [ 663.996199] env[61439]: DEBUG oslo_concurrency.lockutils [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Acquiring lock "refresh_cache-c3fd73f2-027d-42d3-ad53-4925403a9c92" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 663.996199] env[61439]: DEBUG oslo_concurrency.lockutils [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Acquired lock "refresh_cache-c3fd73f2-027d-42d3-ad53-4925403a9c92" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 663.996199] env[61439]: DEBUG nova.network.neutron [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 664.033725] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e97f4cd4-bfc0-45a0-891f-099b509a7560 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 664.044287] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8602da6-45ba-4427-848c-0f4c66266a51 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 664.077694] env[61439]: DEBUG nova.network.neutron [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 664.079753] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67cc54fa-adf5-4873-a75e-528835ae3ec9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 664.096046] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a0a74ae-c806-4ab4-9cbc-8be0d6cb1420 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 664.114200] env[61439]: DEBUG nova.compute.provider_tree [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 664.126100] env[61439]: DEBUG nova.scheduler.client.report [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 
35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 664.158875] env[61439]: DEBUG oslo_concurrency.lockutils [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.325s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 664.159404] env[61439]: DEBUG nova.compute.manager [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 664.214461] env[61439]: DEBUG nova.compute.utils [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 664.215507] env[61439]: DEBUG nova.compute.manager [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 664.215507] env[61439]: DEBUG nova.network.neutron [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 664.240739] env[61439]: DEBUG nova.compute.manager [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 664.286866] env[61439]: DEBUG oslo_concurrency.lockutils [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Acquiring lock "605eecc4-0fff-44bc-9d79-799c4762707a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 664.287072] env[61439]: DEBUG oslo_concurrency.lockutils [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Lock "605eecc4-0fff-44bc-9d79-799c4762707a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 664.300364] env[61439]: DEBUG nova.compute.manager [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 
tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 664.339921] env[61439]: DEBUG nova.compute.manager [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 664.377291] env[61439]: DEBUG oslo_concurrency.lockutils [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 664.377735] env[61439]: DEBUG oslo_concurrency.lockutils [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 664.382323] env[61439]: INFO nova.compute.claims [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 664.403453] env[61439]: DEBUG nova.virt.hardware [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] 
Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 664.403793] env[61439]: DEBUG nova.virt.hardware [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 664.403991] env[61439]: DEBUG nova.virt.hardware [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 664.404191] env[61439]: DEBUG nova.virt.hardware [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 664.404354] env[61439]: DEBUG nova.virt.hardware [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 
tempest-ListServerFiltersTestJSON-1003122498-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 664.404520] env[61439]: DEBUG nova.virt.hardware [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 664.404774] env[61439]: DEBUG nova.virt.hardware [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 664.405015] env[61439]: DEBUG nova.virt.hardware [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 664.405129] env[61439]: DEBUG nova.virt.hardware [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 664.405313] env[61439]: DEBUG nova.virt.hardware [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 664.405489] env[61439]: 
DEBUG nova.virt.hardware [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 664.407330] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-401f89f5-1a71-475a-bcf6-3ca03db3bc59 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 664.419242] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a73d8962-2a8e-4e25-8ab5-f4869d6d97cd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 664.443155] env[61439]: DEBUG nova.policy [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '564a426ea7b54c38a2bb07a08fc935be', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '00123cdebdcf4b01ad920d353fcacff4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}}
[ 664.624910] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77523a14-c5ee-47da-b211-eb4ff638a37b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 664.639797] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50c2fe92-a49f-4f9e-83f0-58a7f676e784 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 664.686345] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb5c195c-e568-4c3c-a905-f5cc6723fd2e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 664.697749] env[61439]: DEBUG nova.network.neutron [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 664.700833] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b600828-f078-4466-8614-edb9bfd6fc89 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 664.718500] env[61439]: DEBUG nova.compute.provider_tree [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 664.720949] env[61439]: DEBUG oslo_concurrency.lockutils [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Releasing lock "refresh_cache-c3fd73f2-027d-42d3-ad53-4925403a9c92" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 664.721379] env[61439]: DEBUG nova.compute.manager [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 664.721570] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 664.722313] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5d3666cb-2d0c-4f69-a16d-2fd3a98379e3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 664.731339] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8720cf9-891b-4a50-adb1-c59f0e9adb10 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 664.748416] env[61439]: DEBUG nova.scheduler.client.report [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 664.765965] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c3fd73f2-027d-42d3-ad53-4925403a9c92 could not be found.
[ 664.766213] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 664.767106] env[61439]: INFO nova.compute.manager [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 664.767106] env[61439]: DEBUG oslo.service.loopingcall [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 664.767357] env[61439]: DEBUG oslo_concurrency.lockutils [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.390s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 664.767812] env[61439]: DEBUG nova.compute.manager [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 664.770543] env[61439]: DEBUG nova.compute.manager [-] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 664.770543] env[61439]: DEBUG nova.network.neutron [-] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 664.826262] env[61439]: DEBUG nova.compute.utils [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 664.827562] env[61439]: DEBUG nova.compute.manager [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 664.827735] env[61439]: DEBUG nova.network.neutron [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 664.834515] env[61439]: DEBUG nova.network.neutron [-] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 664.841553] env[61439]: DEBUG nova.network.neutron [-] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 664.843643] env[61439]: DEBUG nova.compute.manager [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 664.856954] env[61439]: INFO nova.compute.manager [-] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Took 0.09 seconds to deallocate network for instance.
[ 664.859358] env[61439]: DEBUG nova.compute.claims [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 664.859618] env[61439]: DEBUG oslo_concurrency.lockutils [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 664.859930] env[61439]: DEBUG oslo_concurrency.lockutils [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 664.934354] env[61439]: DEBUG nova.compute.manager [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 664.962683] env[61439]: DEBUG nova.virt.hardware [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=<?>,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-09-20T17:02:48Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 664.962897] env[61439]: DEBUG nova.virt.hardware [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 664.963101] env[61439]: DEBUG nova.virt.hardware [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 664.963186] env[61439]: DEBUG nova.virt.hardware [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 664.964242] env[61439]: DEBUG nova.virt.hardware [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 664.964242] env[61439]: DEBUG nova.virt.hardware [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 664.964242] env[61439]: DEBUG nova.virt.hardware [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 664.964242] env[61439]: DEBUG nova.virt.hardware [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 664.964242] env[61439]: DEBUG nova.virt.hardware [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 664.964731] env[61439]: DEBUG nova.virt.hardware [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 664.964731] env[61439]: DEBUG nova.virt.hardware [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 664.967063] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b53b4d1a-15e9-42dc-9580-f4dacc4ca290 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 664.979563] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6f869e4-3ec9-4334-8291-5f71d00cb364 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 665.081954] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93cb286b-2176-470d-a668-77cd752f1d9d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 665.089570] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bf7dbfa-8dee-4cd2-b6c7-233e592d8b3b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 665.122436] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d504aab8-bcb0-4f8e-ad52-63389311ca29 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 665.133705] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b807e549-f1fc-4e98-b7d6-73374d71347f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 665.146243] env[61439]: DEBUG nova.compute.provider_tree [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 665.155563] env[61439]: DEBUG nova.scheduler.client.report [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 665.176463] env[61439]: DEBUG oslo_concurrency.lockutils [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.316s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 665.177103] env[61439]: ERROR nova.compute.manager [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 6787e179-849f-4357-9cfb-6b6f599dc8c9, please check neutron logs for more information.
[ 665.177103] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Traceback (most recent call last):
[ 665.177103] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 665.177103] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]     self.driver.spawn(context, instance, image_meta,
[ 665.177103] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 665.177103] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 665.177103] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 665.177103] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]     vm_ref = self.build_virtual_machine(instance,
[ 665.177103] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 665.177103] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 665.177103] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 665.180367] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]     for vif in network_info:
[ 665.180367] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 665.180367] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]     return self._sync_wrapper(fn, *args, **kwargs)
[ 665.180367] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 665.180367] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]     self.wait()
[ 665.180367] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 665.180367] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]     self[:] = self._gt.wait()
[ 665.180367] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 665.180367] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]     return self._exit_event.wait()
[ 665.180367] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 665.180367] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]     result = hub.switch()
[ 665.180367] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 665.180735] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]     return self.greenlet.switch()
[ 665.180735] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 665.180735] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]     result = function(*args, **kwargs)
[ 665.180735] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]   File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 665.180735] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]     return func(*args, **kwargs)
[ 665.180735] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]   File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 665.180735] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]     raise e
[ 665.180735] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]   File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 665.180735] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]     nwinfo = self.network_api.allocate_for_instance(
[ 665.180735] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 665.180735] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]     created_port_ids = self._update_ports_for_instance(
[ 665.180735] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 665.180735] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]     with excutils.save_and_reraise_exception():
[ 665.180735] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 665.181151] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]     self.force_reraise()
[ 665.181151] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 665.181151] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]     raise self.value
[ 665.181151] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 665.181151] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]     updated_port = self._update_port(
[ 665.181151] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 665.181151] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]     _ensure_no_port_binding_failure(port)
[ 665.181151] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 665.181151] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]     raise exception.PortBindingFailed(port_id=port['id'])
[ 665.181151] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] nova.exception.PortBindingFailed: Binding failed for port 6787e179-849f-4357-9cfb-6b6f599dc8c9, please check neutron logs for more information.
[ 665.181151] env[61439]: ERROR nova.compute.manager [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92]
[ 665.181427] env[61439]: DEBUG nova.compute.utils [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Binding failed for port 6787e179-849f-4357-9cfb-6b6f599dc8c9, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 665.181427] env[61439]: DEBUG nova.policy [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '92f701ef37364c629ed2fed539cda2cf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b8d93352efbf41be917afa7c0b5a1ed3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}}
[ 665.182494] env[61439]: DEBUG nova.compute.manager [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Build of instance c3fd73f2-027d-42d3-ad53-4925403a9c92 was re-scheduled: Binding failed for port 6787e179-849f-4357-9cfb-6b6f599dc8c9, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 665.182949] env[61439]: DEBUG nova.compute.manager [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 665.183245] env[61439]: DEBUG oslo_concurrency.lockutils [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Acquiring lock "refresh_cache-c3fd73f2-027d-42d3-ad53-4925403a9c92" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 665.183420] env[61439]: DEBUG oslo_concurrency.lockutils [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Acquired lock "refresh_cache-c3fd73f2-027d-42d3-ad53-4925403a9c92" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 665.183593] env[61439]: DEBUG nova.network.neutron [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 665.362955] env[61439]: DEBUG nova.network.neutron [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 665.478373] env[61439]: DEBUG nova.network.neutron [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Successfully created port: 2368fddb-20b2-40b3-b7d3-7d1fce015c95 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 666.161228] env[61439]: DEBUG oslo_concurrency.lockutils [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Acquiring lock "e32ca59c-5bce-40e5-85e9-0979e673688a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 666.161624] env[61439]: DEBUG oslo_concurrency.lockutils [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Lock "e32ca59c-5bce-40e5-85e9-0979e673688a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 666.181022] env[61439]: DEBUG nova.compute.manager [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 666.248071] env[61439]: DEBUG oslo_concurrency.lockutils [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 666.248352] env[61439]: DEBUG oslo_concurrency.lockutils [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 666.249997] env[61439]: INFO nova.compute.claims [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 666.371424] env[61439]: DEBUG nova.network.neutron [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 666.389017] env[61439]: DEBUG oslo_concurrency.lockutils [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Releasing lock "refresh_cache-c3fd73f2-027d-42d3-ad53-4925403a9c92" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 666.389017] env[61439]: DEBUG nova.compute.manager [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 666.389017] env[61439]: DEBUG nova.compute.manager [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 666.389017] env[61439]: DEBUG nova.network.neutron [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 666.431287] env[61439]: DEBUG nova.network.neutron [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Successfully created port: b4c6a589-d627-4d9e-b784-1847a9fab568 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 666.495156] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd3e1f97-5595-4bea-9155-f4fc394f2d24 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 666.503577] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8034aa4-3535-4c0c-855b-c07a91ee5e1a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 666.542357] env[61439]: DEBUG nova.network.neutron [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 666.545327] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e688bb80-306a-467f-8481-ac6779876fd9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 666.553544] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba68b2c3-5da8-4413-9682-3621bd0db9d7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 666.559161] env[61439]: DEBUG nova.network.neutron [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 666.572443] env[61439]: DEBUG nova.compute.provider_tree [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 666.574266] env[61439]: INFO nova.compute.manager [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: c3fd73f2-027d-42d3-ad53-4925403a9c92] Took 0.19 seconds to deallocate network for instance.
[ 666.581411] env[61439]: DEBUG nova.scheduler.client.report [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 666.623331] env[61439]: DEBUG oslo_concurrency.lockutils [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.375s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 666.623862] env[61439]: DEBUG nova.compute.manager [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Start building networks asynchronously for instance.
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 666.662046] env[61439]: DEBUG nova.compute.utils [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 666.663505] env[61439]: DEBUG nova.compute.manager [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 666.663560] env[61439]: DEBUG nova.network.neutron [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 666.674779] env[61439]: DEBUG nova.compute.manager [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Start building block device mappings for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 666.693573] env[61439]: INFO nova.scheduler.client.report [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Deleted allocations for instance c3fd73f2-027d-42d3-ad53-4925403a9c92 [ 666.719681] env[61439]: DEBUG oslo_concurrency.lockutils [None req-55fb50b8-fa11-4d57-81fc-29c0dd1f637e tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Lock "c3fd73f2-027d-42d3-ad53-4925403a9c92" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 9.546s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 666.751506] env[61439]: DEBUG nova.compute.manager [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 666.781701] env[61439]: DEBUG nova.virt.hardware [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 666.781929] env[61439]: DEBUG nova.virt.hardware [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 666.782100] env[61439]: DEBUG nova.virt.hardware [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 666.782346] env[61439]: DEBUG nova.virt.hardware [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Flavor pref 0:0:0 {{(pid=61439) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 666.782509] env[61439]: DEBUG nova.virt.hardware [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 666.782658] env[61439]: DEBUG nova.virt.hardware [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 666.782862] env[61439]: DEBUG nova.virt.hardware [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 666.783248] env[61439]: DEBUG nova.virt.hardware [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 666.783512] env[61439]: DEBUG nova.virt.hardware [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 666.783689] env[61439]: DEBUG nova.virt.hardware [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 666.783865] env[61439]: DEBUG nova.virt.hardware [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 666.784715] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7dc7758-fc98-4dc7-bb38-496d04851ef6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 666.794951] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab87754f-b2e4-4484-89f6-56abd688db72 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 666.926873] env[61439]: ERROR nova.compute.manager [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 4f72723f-9f91-4066-a9a6-fe0bf74afafd, please check neutron logs for more information. 
[ 666.926873] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 666.926873] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 666.926873] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 666.926873] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 666.926873] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 666.926873] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 666.926873] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 666.926873] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 666.926873] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 666.926873] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 666.926873] env[61439]: ERROR nova.compute.manager raise self.value [ 666.926873] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 666.926873] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 666.926873] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 666.926873] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 666.927658] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 666.927658] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 666.927658] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 4f72723f-9f91-4066-a9a6-fe0bf74afafd, please check neutron logs for more information. [ 666.927658] env[61439]: ERROR nova.compute.manager [ 666.927658] env[61439]: Traceback (most recent call last): [ 666.927658] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 666.927658] env[61439]: listener.cb(fileno) [ 666.927658] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 666.927658] env[61439]: result = function(*args, **kwargs) [ 666.927658] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 666.927658] env[61439]: return func(*args, **kwargs) [ 666.927658] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 666.927658] env[61439]: raise e [ 666.927658] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 666.927658] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 666.927658] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 666.927658] env[61439]: created_port_ids = self._update_ports_for_instance( [ 666.927658] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 666.927658] env[61439]: with excutils.save_and_reraise_exception(): [ 666.927658] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 666.927658] env[61439]: self.force_reraise() [ 666.927658] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 666.927658] env[61439]: raise self.value [ 666.927658] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 666.927658] env[61439]: 
updated_port = self._update_port( [ 666.927658] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 666.927658] env[61439]: _ensure_no_port_binding_failure(port) [ 666.927658] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 666.927658] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 666.928390] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 4f72723f-9f91-4066-a9a6-fe0bf74afafd, please check neutron logs for more information. [ 666.928390] env[61439]: Removing descriptor: 21 [ 666.928390] env[61439]: ERROR nova.compute.manager [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 4f72723f-9f91-4066-a9a6-fe0bf74afafd, please check neutron logs for more information. 
[ 666.928390] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Traceback (most recent call last): [ 666.928390] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 666.928390] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] yield resources [ 666.928390] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 666.928390] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] self.driver.spawn(context, instance, image_meta, [ 666.928390] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 666.928390] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 666.928390] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 666.928390] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] vm_ref = self.build_virtual_machine(instance, [ 666.928696] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 666.928696] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] vif_infos = vmwarevif.get_vif_info(self._session, [ 666.928696] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 666.928696] env[61439]: ERROR 
nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] for vif in network_info: [ 666.928696] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 666.928696] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] return self._sync_wrapper(fn, *args, **kwargs) [ 666.928696] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 666.928696] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] self.wait() [ 666.928696] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 666.928696] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] self[:] = self._gt.wait() [ 666.928696] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 666.928696] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] return self._exit_event.wait() [ 666.928696] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 666.929438] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] result = hub.switch() [ 666.929438] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 666.929438] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] return self.greenlet.switch() [ 666.929438] 
env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 666.929438] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] result = function(*args, **kwargs) [ 666.929438] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 666.929438] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] return func(*args, **kwargs) [ 666.929438] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 666.929438] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] raise e [ 666.929438] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 666.929438] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] nwinfo = self.network_api.allocate_for_instance( [ 666.929438] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 666.929438] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] created_port_ids = self._update_ports_for_instance( [ 666.929809] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 666.929809] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] with excutils.save_and_reraise_exception(): [ 666.929809] env[61439]: ERROR nova.compute.manager 
[instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 666.929809] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] self.force_reraise() [ 666.929809] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 666.929809] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] raise self.value [ 666.929809] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 666.929809] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] updated_port = self._update_port( [ 666.929809] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 666.929809] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] _ensure_no_port_binding_failure(port) [ 666.929809] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 666.929809] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] raise exception.PortBindingFailed(port_id=port['id']) [ 666.930114] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] nova.exception.PortBindingFailed: Binding failed for port 4f72723f-9f91-4066-a9a6-fe0bf74afafd, please check neutron logs for more information. 
[ 666.930114] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] [ 666.930114] env[61439]: INFO nova.compute.manager [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Terminating instance [ 666.932869] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Acquiring lock "refresh_cache-fb8aee53-11e7-4e32-9225-06671bb511d7" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 666.933332] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Acquired lock "refresh_cache-fb8aee53-11e7-4e32-9225-06671bb511d7" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 666.933332] env[61439]: DEBUG nova.network.neutron [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 667.027166] env[61439]: DEBUG nova.network.neutron [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 667.200416] env[61439]: DEBUG nova.policy [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e462b9a9c54f42efb88b614e2b31eda3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '177860dab437443984a62a6110755d3b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 667.263260] env[61439]: DEBUG nova.network.neutron [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Successfully created port: 8a72056c-1a3f-41ec-8540-27f0817eba1c {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 667.618806] env[61439]: DEBUG nova.network.neutron [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 667.629348] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Releasing lock "refresh_cache-fb8aee53-11e7-4e32-9225-06671bb511d7" {{(pid=61439) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 667.629762] env[61439]: DEBUG nova.compute.manager [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 667.629955] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 667.630535] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-259d453f-cd41-435f-824d-bd2b8792c521 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 667.642285] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c5f3d9f-c512-4d8e-aa55-1d8a6237d396 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 667.671966] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance fb8aee53-11e7-4e32-9225-06671bb511d7 could not be found. 
[ 667.672396] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 667.672515] env[61439]: INFO nova.compute.manager [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 667.673070] env[61439]: DEBUG oslo.service.loopingcall [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 667.673639] env[61439]: DEBUG nova.compute.manager [-] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 667.673853] env[61439]: DEBUG nova.network.neutron [-] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 667.920794] env[61439]: DEBUG nova.network.neutron [-] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 667.929164] env[61439]: DEBUG nova.network.neutron [-] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 667.938385] env[61439]: INFO nova.compute.manager [-] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Took 0.26 seconds to deallocate network for instance.
[ 667.941466] env[61439]: DEBUG nova.compute.claims [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 667.941595] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 667.943334] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 668.109949] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-baf78fd7-40c1-4e25-9239-7d3bb0990c77 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 668.121980] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27b99e0a-4e95-4475-8086-58b7599bc2d7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 668.156972] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ffe9ad4-2aa8-4593-9c59-cfa679b3406a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 668.167538] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4bd5e04c-ce14-4f4f-83aa-a4a1bda442f1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 668.182058] env[61439]: DEBUG nova.compute.provider_tree [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 668.195385] env[61439]: DEBUG nova.scheduler.client.report [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 668.216608] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.275s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 668.217247] env[61439]: ERROR nova.compute.manager [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 4f72723f-9f91-4066-a9a6-fe0bf74afafd, please check neutron logs for more information.
[ 668.217247] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Traceback (most recent call last):
[ 668.217247] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 668.217247] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] self.driver.spawn(context, instance, image_meta,
[ 668.217247] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 668.217247] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 668.217247] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 668.217247] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] vm_ref = self.build_virtual_machine(instance,
[ 668.217247] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 668.217247] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] vif_infos = vmwarevif.get_vif_info(self._session,
[ 668.217247] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 668.217616] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] for vif in network_info:
[ 668.217616] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 668.217616] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] return self._sync_wrapper(fn, *args, **kwargs)
[ 668.217616] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 668.217616] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] self.wait()
[ 668.217616] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 668.217616] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] self[:] = self._gt.wait()
[ 668.217616] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 668.217616] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] return self._exit_event.wait()
[ 668.217616] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 668.217616] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] result = hub.switch()
[ 668.217616] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 668.217616] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] return self.greenlet.switch()
[ 668.218118] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 668.218118] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] result = function(*args, **kwargs)
[ 668.218118] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 668.218118] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] return func(*args, **kwargs)
[ 668.218118] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 668.218118] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] raise e
[ 668.218118] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 668.218118] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] nwinfo = self.network_api.allocate_for_instance(
[ 668.218118] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 668.218118] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] created_port_ids = self._update_ports_for_instance(
[ 668.218118] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 668.218118] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] with excutils.save_and_reraise_exception():
[ 668.218118] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 668.218918] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] self.force_reraise()
[ 668.218918] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 668.218918] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] raise self.value
[ 668.218918] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 668.218918] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] updated_port = self._update_port(
[ 668.218918] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 668.218918] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] _ensure_no_port_binding_failure(port)
[ 668.218918] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 668.218918] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] raise exception.PortBindingFailed(port_id=port['id'])
[ 668.218918] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] nova.exception.PortBindingFailed: Binding failed for port 4f72723f-9f91-4066-a9a6-fe0bf74afafd, please check neutron logs for more information.
[ 668.218918] env[61439]: ERROR nova.compute.manager [instance: fb8aee53-11e7-4e32-9225-06671bb511d7]
[ 668.219259] env[61439]: DEBUG nova.compute.utils [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Binding failed for port 4f72723f-9f91-4066-a9a6-fe0bf74afafd, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 668.222753] env[61439]: DEBUG nova.compute.manager [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Build of instance fb8aee53-11e7-4e32-9225-06671bb511d7 was re-scheduled: Binding failed for port 4f72723f-9f91-4066-a9a6-fe0bf74afafd, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 668.223371] env[61439]: DEBUG nova.compute.manager [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 668.223659] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Acquiring lock "refresh_cache-fb8aee53-11e7-4e32-9225-06671bb511d7" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 668.223825] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Acquired lock "refresh_cache-fb8aee53-11e7-4e32-9225-06671bb511d7" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 668.223992] env[61439]: DEBUG nova.network.neutron [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 668.307669] env[61439]: DEBUG nova.network.neutron [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 668.793707] env[61439]: DEBUG nova.network.neutron [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 668.805176] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Releasing lock "refresh_cache-fb8aee53-11e7-4e32-9225-06671bb511d7" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 668.805586] env[61439]: DEBUG nova.compute.manager [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 668.805954] env[61439]: DEBUG nova.compute.manager [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 668.805954] env[61439]: DEBUG nova.network.neutron [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 669.036743] env[61439]: DEBUG nova.network.neutron [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 669.047612] env[61439]: DEBUG nova.network.neutron [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 669.058136] env[61439]: INFO nova.compute.manager [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: fb8aee53-11e7-4e32-9225-06671bb511d7] Took 0.25 seconds to deallocate network for instance.
[ 669.197053] env[61439]: INFO nova.scheduler.client.report [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Deleted allocations for instance fb8aee53-11e7-4e32-9225-06671bb511d7
[ 669.227372] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b07bd65-a3d3-4a63-a463-ac2e8be1c6b1 tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Lock "fb8aee53-11e7-4e32-9225-06671bb511d7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.220s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 669.323452] env[61439]: ERROR nova.compute.manager [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 9b109a6b-fa18-4dfc-9ac9-4beedf6be738, please check neutron logs for more information.
[ 669.323452] env[61439]: ERROR nova.compute.manager Traceback (most recent call last):
[ 669.323452] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 669.323452] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 669.323452] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 669.323452] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 669.323452] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 669.323452] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 669.323452] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 669.323452] env[61439]: ERROR nova.compute.manager self.force_reraise()
[ 669.323452] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 669.323452] env[61439]: ERROR nova.compute.manager raise self.value
[ 669.323452] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 669.323452] env[61439]: ERROR nova.compute.manager updated_port = self._update_port(
[ 669.323452] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 669.323452] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 669.323935] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 669.323935] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 669.323935] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 9b109a6b-fa18-4dfc-9ac9-4beedf6be738, please check neutron logs for more information.
[ 669.323935] env[61439]: ERROR nova.compute.manager
[ 669.323935] env[61439]: Traceback (most recent call last):
[ 669.323935] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait
[ 669.323935] env[61439]: listener.cb(fileno)
[ 669.323935] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 669.323935] env[61439]: result = function(*args, **kwargs)
[ 669.323935] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 669.323935] env[61439]: return func(*args, **kwargs)
[ 669.323935] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 669.323935] env[61439]: raise e
[ 669.323935] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 669.323935] env[61439]: nwinfo = self.network_api.allocate_for_instance(
[ 669.323935] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 669.323935] env[61439]: created_port_ids = self._update_ports_for_instance(
[ 669.323935] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 669.323935] env[61439]: with excutils.save_and_reraise_exception():
[ 669.323935] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 669.323935] env[61439]: self.force_reraise()
[ 669.323935] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 669.323935] env[61439]: raise self.value
[ 669.323935] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 669.323935] env[61439]: updated_port = self._update_port(
[ 669.323935] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 669.323935] env[61439]: _ensure_no_port_binding_failure(port)
[ 669.323935] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 669.323935] env[61439]: raise exception.PortBindingFailed(port_id=port['id'])
[ 669.324764] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 9b109a6b-fa18-4dfc-9ac9-4beedf6be738, please check neutron logs for more information.
[ 669.324764] env[61439]: Removing descriptor: 20
[ 669.324764] env[61439]: ERROR nova.compute.manager [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 9b109a6b-fa18-4dfc-9ac9-4beedf6be738, please check neutron logs for more information.
[ 669.324764] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Traceback (most recent call last):
[ 669.324764] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 669.324764] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] yield resources
[ 669.324764] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 669.324764] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] self.driver.spawn(context, instance, image_meta,
[ 669.324764] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 669.324764] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 669.324764] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 669.324764] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] vm_ref = self.build_virtual_machine(instance,
[ 669.325248] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 669.325248] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] vif_infos = vmwarevif.get_vif_info(self._session,
[ 669.325248] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 669.325248] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] for vif in network_info:
[ 669.325248] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 669.325248] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] return self._sync_wrapper(fn, *args, **kwargs)
[ 669.325248] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 669.325248] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] self.wait()
[ 669.325248] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 669.325248] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] self[:] = self._gt.wait()
[ 669.325248] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 669.325248] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] return self._exit_event.wait()
[ 669.325248] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 669.325597] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] result = hub.switch()
[ 669.325597] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 669.325597] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] return self.greenlet.switch()
[ 669.325597] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 669.325597] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] result = function(*args, **kwargs)
[ 669.325597] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 669.325597] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] return func(*args, **kwargs)
[ 669.325597] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 669.325597] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] raise e
[ 669.325597] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 669.325597] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] nwinfo = self.network_api.allocate_for_instance(
[ 669.325597] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 669.325597] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] created_port_ids = self._update_ports_for_instance(
[ 669.325979] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 669.325979] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] with excutils.save_and_reraise_exception():
[ 669.325979] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 669.325979] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] self.force_reraise()
[ 669.325979] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 669.325979] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] raise self.value
[ 669.325979] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 669.325979] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] updated_port = self._update_port(
[ 669.325979] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 669.325979] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] _ensure_no_port_binding_failure(port)
[ 669.325979] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 669.325979] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] raise exception.PortBindingFailed(port_id=port['id'])
[ 669.326331] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] nova.exception.PortBindingFailed: Binding failed for port 9b109a6b-fa18-4dfc-9ac9-4beedf6be738, please check neutron logs for more information.
[ 669.326331] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2]
[ 669.326331] env[61439]: INFO nova.compute.manager [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Terminating instance
[ 669.328015] env[61439]: DEBUG oslo_concurrency.lockutils [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Acquiring lock "refresh_cache-7c682f87-0bf4-460c-97a7-bd369d91a6a2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 669.328288] env[61439]: DEBUG oslo_concurrency.lockutils [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Acquired lock "refresh_cache-7c682f87-0bf4-460c-97a7-bd369d91a6a2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 669.328532] env[61439]: DEBUG nova.network.neutron [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 669.414556] env[61439]: DEBUG nova.network.neutron [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 669.724288] env[61439]: DEBUG nova.network.neutron [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 669.739043] env[61439]: DEBUG oslo_concurrency.lockutils [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Releasing lock "refresh_cache-7c682f87-0bf4-460c-97a7-bd369d91a6a2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 669.739985] env[61439]: DEBUG nova.compute.manager [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 669.740537] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 669.741306] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b501c90f-c606-482b-8b99-321308ce8f93 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 669.755901] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5bcaafa-905d-4a6c-932f-6a91c455dfbf {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 669.787964] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7c682f87-0bf4-460c-97a7-bd369d91a6a2 could not be found.
[ 669.788961] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 669.788961] env[61439]: INFO nova.compute.manager [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Took 0.05 seconds to destroy the instance on the hypervisor.
[ 669.788961] env[61439]: DEBUG oslo.service.loopingcall [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 669.789328] env[61439]: DEBUG nova.compute.manager [-] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 669.789478] env[61439]: DEBUG nova.network.neutron [-] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 669.825308] env[61439]: DEBUG nova.network.neutron [-] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 669.838120] env[61439]: DEBUG nova.network.neutron [-] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 669.851791] env[61439]: INFO nova.compute.manager [-] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Took 0.06 seconds to deallocate network for instance.
[ 669.857592] env[61439]: DEBUG nova.compute.claims [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 669.857808] env[61439]: DEBUG oslo_concurrency.lockutils [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 669.858106] env[61439]: DEBUG oslo_concurrency.lockutils [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 670.039945] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2634c92-9fdc-4fa0-ad49-aceaa6e7e3cc {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 670.052226] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b846d7c-eff5-4905-94bb-c2c2edfe12c0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 670.086283] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd7fe6da-af0e-4280-8d37-c3ba5911fffe {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 670.096605] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8741e967-2205-4ed9-aa4e-4a2120b42bf5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 670.116645] env[61439]: DEBUG nova.compute.provider_tree [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 670.128203] env[61439]: DEBUG nova.scheduler.client.report [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 670.152332] env[61439]: DEBUG oslo_concurrency.lockutils [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.294s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 670.153043] env[61439]: ERROR nova.compute.manager [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 9b109a6b-fa18-4dfc-9ac9-4beedf6be738, please check neutron logs for more information.
[ 670.153043] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Traceback (most recent call last):
[ 670.153043] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 670.153043] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] self.driver.spawn(context, instance, image_meta,
[ 670.153043] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 670.153043] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 670.153043] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 670.153043] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] vm_ref = self.build_virtual_machine(instance,
[ 670.153043] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 670.153043] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] vif_infos = vmwarevif.get_vif_info(self._session,
[ 670.153043] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 670.153466] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] for vif in network_info:
[ 670.153466] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 670.153466] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] return self._sync_wrapper(fn, *args, **kwargs)
[ 670.153466] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 670.153466] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] self.wait()
[ 670.153466] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 670.153466] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] self[:] = self._gt.wait()
[ 670.153466] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 670.153466] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] return self._exit_event.wait()
[ 670.153466] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 670.153466] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] result = hub.switch()
[ 670.153466] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 670.153466] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] return self.greenlet.switch()
[ 670.153884] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 670.153884] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] result = function(*args, **kwargs)
[ 670.153884] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 670.153884] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] return func(*args, **kwargs)
[ 670.153884] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 670.153884] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] raise e
[ 670.153884] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 670.153884] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] nwinfo = self.network_api.allocate_for_instance(
[ 670.153884] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 670.153884] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] created_port_ids = self._update_ports_for_instance(
[ 670.153884] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 670.153884] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] with excutils.save_and_reraise_exception():
[ 670.153884] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 670.154207] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] self.force_reraise()
[ 670.154207] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 670.154207] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] raise self.value
[ 670.154207] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 670.154207] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] updated_port = self._update_port(
[ 670.154207] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 670.154207] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] _ensure_no_port_binding_failure(port)
[ 670.154207] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 670.154207] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] raise exception.PortBindingFailed(port_id=port['id'])
[ 670.154207] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] nova.exception.PortBindingFailed: Binding failed for port 9b109a6b-fa18-4dfc-9ac9-4beedf6be738, please check neutron logs for more information.
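Each of these tracebacks ends in `_ensure_no_port_binding_failure` raising `nova.exception.PortBindingFailed`. A minimal sketch of that check, assuming (as in Nova's network model) that Neutron reports a failed binding by setting the port's `binding:vif_type` to `'binding_failed'`; the exception class below is a simplified stand-in, not Nova's actual code:

```python
class PortBindingFailed(Exception):
    """Simplified stand-in for nova.exception.PortBindingFailed."""
    def __init__(self, port_id):
        super().__init__(
            f"Binding failed for port {port_id}, "
            "please check neutron logs for more information.")
        self.port_id = port_id

# Sentinel value Neutron uses for a port whose binding failed.
VIF_TYPE_BINDING_FAILED = 'binding_failed'

def ensure_no_port_binding_failure(port):
    # Any other vif_type (ovs, bridge, ...) passes through silently.
    if port.get('binding:vif_type') == VIF_TYPE_BINDING_FAILED:
        raise PortBindingFailed(port_id=port['id'])
```

So the error above means Neutron returned the port with a failed binding; the root cause (e.g. no mechanism driver able to bind the port on this host) is in the neutron server logs, as the message says.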
[ 670.154207] env[61439]: ERROR nova.compute.manager [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2]
[ 670.154473] env[61439]: DEBUG nova.compute.utils [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Binding failed for port 9b109a6b-fa18-4dfc-9ac9-4beedf6be738, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 670.155999] env[61439]: DEBUG nova.compute.manager [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Build of instance 7c682f87-0bf4-460c-97a7-bd369d91a6a2 was re-scheduled: Binding failed for port 9b109a6b-fa18-4dfc-9ac9-4beedf6be738, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 670.159610] env[61439]: DEBUG nova.compute.manager [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 670.159610] env[61439]: DEBUG oslo_concurrency.lockutils [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Acquiring lock "refresh_cache-7c682f87-0bf4-460c-97a7-bd369d91a6a2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 670.159610] env[61439]: DEBUG oslo_concurrency.lockutils [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Acquired lock "refresh_cache-7c682f87-0bf4-460c-97a7-bd369d91a6a2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 670.159610] env[61439]: DEBUG nova.network.neutron [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 670.211693] env[61439]: DEBUG nova.network.neutron [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 670.361849] env[61439]: DEBUG nova.network.neutron [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Successfully created port: 8a86f436-b6f2-45a2-872f-ea805c774367 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 670.664747] env[61439]: DEBUG nova.network.neutron [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 670.688200] env[61439]: DEBUG oslo_concurrency.lockutils [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Releasing lock "refresh_cache-7c682f87-0bf4-460c-97a7-bd369d91a6a2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 670.688457] env[61439]: DEBUG nova.compute.manager [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 670.688639] env[61439]: DEBUG nova.compute.manager [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 670.688803] env[61439]: DEBUG nova.network.neutron [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 670.739809] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Acquiring lock "daf5a018-8f3c-4aa3-bff6-d185c9f125b3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 670.740060] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Lock "daf5a018-8f3c-4aa3-bff6-d185c9f125b3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 670.759939] env[61439]: DEBUG nova.compute.manager [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 670.820441] env[61439]: DEBUG nova.network.neutron [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 670.832681] env[61439]: DEBUG nova.network.neutron [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 670.838647] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 670.838945] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 670.840768] env[61439]: INFO nova.compute.claims [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 670.850202] env[61439]: INFO nova.compute.manager [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] [instance: 7c682f87-0bf4-460c-97a7-bd369d91a6a2] Took 0.16 seconds to deallocate network for instance.
[ 671.009281] env[61439]: INFO nova.scheduler.client.report [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Deleted allocations for instance 7c682f87-0bf4-460c-97a7-bd369d91a6a2
[ 671.046723] env[61439]: DEBUG oslo_concurrency.lockutils [None req-94cae6c2-bef0-4f51-b7c0-29c244366628 tempest-ServerMetadataTestJSON-618169477 tempest-ServerMetadataTestJSON-618169477-project-member] Lock "7c682f87-0bf4-460c-97a7-bd369d91a6a2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.310s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 671.108249] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1dc3f669-8892-4b78-85c9-25ae95bc0bd7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 671.119437] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f7a8570-5df1-490d-81ef-ebb5211b10f7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 671.154944] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4eb7e9d7-36e2-48a0-92ce-e4a71139501e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 671.163241] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c8f07e8-02e2-4a31-82ef-d098618526a6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 671.184907] env[61439]: DEBUG nova.compute.provider_tree [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 671.194585] env[61439]: DEBUG nova.scheduler.client.report [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 671.214754] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.376s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 671.215670] env[61439]: DEBUG nova.compute.manager [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 671.259314] env[61439]: DEBUG nova.compute.utils [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 671.260840] env[61439]: DEBUG nova.compute.manager [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 671.261096] env[61439]: DEBUG nova.network.neutron [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 671.276510] env[61439]: DEBUG nova.compute.manager [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 671.357301] env[61439]: ERROR nova.compute.manager [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port dc8ac46f-f616-4f92-ad64-f212950b7634, please check neutron logs for more information.
[ 671.357301] env[61439]: ERROR nova.compute.manager Traceback (most recent call last):
[ 671.357301] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 671.357301] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 671.357301] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 671.357301] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 671.357301] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 671.357301] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 671.357301] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 671.357301] env[61439]: ERROR nova.compute.manager self.force_reraise()
[ 671.357301] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 671.357301] env[61439]: ERROR nova.compute.manager raise self.value
[ 671.357301] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 671.357301] env[61439]: ERROR nova.compute.manager updated_port = self._update_port(
[ 671.357301] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 671.357301] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 671.358414] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 671.358414] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 671.358414] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port dc8ac46f-f616-4f92-ad64-f212950b7634, please check neutron logs for more information.
[ 671.358414] env[61439]: ERROR nova.compute.manager
[ 671.358414] env[61439]: Traceback (most recent call last):
[ 671.358414] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait
[ 671.358414] env[61439]: listener.cb(fileno)
[ 671.358414] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 671.358414] env[61439]: result = function(*args, **kwargs)
[ 671.358414] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 671.358414] env[61439]: return func(*args, **kwargs)
[ 671.358414] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 671.358414] env[61439]: raise e
[ 671.358414] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 671.358414] env[61439]: nwinfo = self.network_api.allocate_for_instance(
[ 671.358414] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 671.358414] env[61439]: created_port_ids = self._update_ports_for_instance(
[ 671.358414] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 671.358414] env[61439]: with excutils.save_and_reraise_exception():
[ 671.358414] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 671.358414] env[61439]: self.force_reraise()
[ 671.358414] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 671.358414] env[61439]: raise self.value
[ 671.358414] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 671.358414] env[61439]: updated_port = self._update_port(
[ 671.358414] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 671.358414] env[61439]: _ensure_no_port_binding_failure(port)
[ 671.358414] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 671.358414] env[61439]: raise exception.PortBindingFailed(port_id=port['id'])
[ 671.359157] env[61439]: nova.exception.PortBindingFailed: Binding failed for port dc8ac46f-f616-4f92-ad64-f212950b7634, please check neutron logs for more information.
[ 671.359157] env[61439]: Removing descriptor: 18
[ 671.359157] env[61439]: ERROR nova.compute.manager [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port dc8ac46f-f616-4f92-ad64-f212950b7634, please check neutron logs for more information.
[ 671.359157] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Traceback (most recent call last): [ 671.359157] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 671.359157] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] yield resources [ 671.359157] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 671.359157] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] self.driver.spawn(context, instance, image_meta, [ 671.359157] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 671.359157] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] self._vmops.spawn(context, instance, image_meta, injected_files, [ 671.359157] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 671.359157] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] vm_ref = self.build_virtual_machine(instance, [ 671.359488] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 671.359488] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] vif_infos = vmwarevif.get_vif_info(self._session, [ 671.359488] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 671.359488] env[61439]: ERROR 
nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] for vif in network_info: [ 671.359488] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 671.359488] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] return self._sync_wrapper(fn, *args, **kwargs) [ 671.359488] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 671.359488] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] self.wait() [ 671.359488] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 671.359488] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] self[:] = self._gt.wait() [ 671.359488] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 671.359488] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] return self._exit_event.wait() [ 671.359488] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 671.359827] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] result = hub.switch() [ 671.359827] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 671.359827] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] return self.greenlet.switch() [ 671.359827] 
env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 671.359827] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] result = function(*args, **kwargs) [ 671.359827] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 671.359827] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] return func(*args, **kwargs) [ 671.359827] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 671.359827] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] raise e [ 671.359827] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 671.359827] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] nwinfo = self.network_api.allocate_for_instance( [ 671.359827] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 671.359827] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] created_port_ids = self._update_ports_for_instance( [ 671.360177] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 671.360177] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] with excutils.save_and_reraise_exception(): [ 671.360177] env[61439]: ERROR nova.compute.manager 
[instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 671.360177] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] self.force_reraise() [ 671.360177] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 671.360177] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] raise self.value [ 671.360177] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 671.360177] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] updated_port = self._update_port( [ 671.360177] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 671.360177] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] _ensure_no_port_binding_failure(port) [ 671.360177] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 671.360177] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] raise exception.PortBindingFailed(port_id=port['id']) [ 671.360494] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] nova.exception.PortBindingFailed: Binding failed for port dc8ac46f-f616-4f92-ad64-f212950b7634, please check neutron logs for more information. 
[ 671.360494] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] [ 671.360494] env[61439]: INFO nova.compute.manager [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Terminating instance [ 671.360903] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Acquiring lock "refresh_cache-4bce210a-e6bc-4c67-ad48-397422291467" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 671.360903] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Acquired lock "refresh_cache-4bce210a-e6bc-4c67-ad48-397422291467" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 671.361021] env[61439]: DEBUG nova.network.neutron [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 671.392909] env[61439]: DEBUG nova.compute.manager [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 671.422908] env[61439]: DEBUG nova.virt.hardware [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='84',id=12,is_public=True,memory_mb=192,name='m1.micro',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 671.422908] env[61439]: DEBUG nova.virt.hardware [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 671.422908] env[61439]: DEBUG nova.virt.hardware [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 671.423188] env[61439]: DEBUG nova.virt.hardware [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 
tempest-ListServerFiltersTestJSON-1003122498-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 671.423188] env[61439]: DEBUG nova.virt.hardware [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 671.423188] env[61439]: DEBUG nova.virt.hardware [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 671.423188] env[61439]: DEBUG nova.virt.hardware [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 671.423714] env[61439]: DEBUG nova.virt.hardware [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 671.423714] env[61439]: DEBUG nova.virt.hardware [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 671.423714] env[61439]: DEBUG nova.virt.hardware [None 
req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 671.423878] env[61439]: DEBUG nova.virt.hardware [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 671.424674] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0204a7d4-c6a4-4e49-8a12-48d66d878e6d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 671.434196] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb672f93-b73d-49dd-8acd-80302a1f9f6d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 671.494434] env[61439]: DEBUG nova.policy [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '564a426ea7b54c38a2bb07a08fc935be', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '00123cdebdcf4b01ad920d353fcacff4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 671.665283] env[61439]: DEBUG 
nova.network.neutron [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 671.812116] env[61439]: ERROR nova.compute.manager [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port b4c6a589-d627-4d9e-b784-1847a9fab568, please check neutron logs for more information. [ 671.812116] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 671.812116] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 671.812116] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 671.812116] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 671.812116] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 671.812116] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 671.812116] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 671.812116] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 671.812116] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 671.812116] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 671.812116] env[61439]: ERROR 
nova.compute.manager raise self.value [ 671.812116] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 671.812116] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 671.812116] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 671.812116] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 671.812677] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 671.812677] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 671.812677] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port b4c6a589-d627-4d9e-b784-1847a9fab568, please check neutron logs for more information. [ 671.812677] env[61439]: ERROR nova.compute.manager [ 671.812677] env[61439]: Traceback (most recent call last): [ 671.812677] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 671.812677] env[61439]: listener.cb(fileno) [ 671.812677] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 671.812677] env[61439]: result = function(*args, **kwargs) [ 671.812677] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 671.812677] env[61439]: return func(*args, **kwargs) [ 671.812677] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 671.812677] env[61439]: raise e [ 671.812677] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 671.812677] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 671.812677] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 
671.812677] env[61439]: created_port_ids = self._update_ports_for_instance( [ 671.812677] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 671.812677] env[61439]: with excutils.save_and_reraise_exception(): [ 671.812677] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 671.812677] env[61439]: self.force_reraise() [ 671.812677] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 671.812677] env[61439]: raise self.value [ 671.812677] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 671.812677] env[61439]: updated_port = self._update_port( [ 671.812677] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 671.812677] env[61439]: _ensure_no_port_binding_failure(port) [ 671.812677] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 671.812677] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 671.813499] env[61439]: nova.exception.PortBindingFailed: Binding failed for port b4c6a589-d627-4d9e-b784-1847a9fab568, please check neutron logs for more information. [ 671.813499] env[61439]: Removing descriptor: 23 [ 671.813499] env[61439]: ERROR nova.compute.manager [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port b4c6a589-d627-4d9e-b784-1847a9fab568, please check neutron logs for more information. 
[ 671.813499] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Traceback (most recent call last): [ 671.813499] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 671.813499] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] yield resources [ 671.813499] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 671.813499] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] self.driver.spawn(context, instance, image_meta, [ 671.813499] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 671.813499] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] self._vmops.spawn(context, instance, image_meta, injected_files, [ 671.813499] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 671.813499] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] vm_ref = self.build_virtual_machine(instance, [ 671.813829] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 671.813829] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] vif_infos = vmwarevif.get_vif_info(self._session, [ 671.813829] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 671.813829] env[61439]: ERROR 
nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] for vif in network_info: [ 671.813829] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 671.813829] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] return self._sync_wrapper(fn, *args, **kwargs) [ 671.813829] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 671.813829] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] self.wait() [ 671.813829] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 671.813829] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] self[:] = self._gt.wait() [ 671.813829] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 671.813829] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] return self._exit_event.wait() [ 671.813829] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 671.814200] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] result = hub.switch() [ 671.814200] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 671.814200] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] return self.greenlet.switch() [ 671.814200] 
env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 671.814200] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] result = function(*args, **kwargs) [ 671.814200] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 671.814200] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] return func(*args, **kwargs) [ 671.814200] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 671.814200] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] raise e [ 671.814200] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 671.814200] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] nwinfo = self.network_api.allocate_for_instance( [ 671.814200] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 671.814200] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] created_port_ids = self._update_ports_for_instance( [ 671.814550] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 671.814550] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] with excutils.save_and_reraise_exception(): [ 671.814550] env[61439]: ERROR nova.compute.manager 
[instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 671.814550] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] self.force_reraise() [ 671.814550] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 671.814550] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] raise self.value [ 671.814550] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 671.814550] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] updated_port = self._update_port( [ 671.814550] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 671.814550] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] _ensure_no_port_binding_failure(port) [ 671.814550] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 671.814550] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] raise exception.PortBindingFailed(port_id=port['id']) [ 671.814863] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] nova.exception.PortBindingFailed: Binding failed for port b4c6a589-d627-4d9e-b784-1847a9fab568, please check neutron logs for more information. 
[ 671.814863] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] [ 671.814863] env[61439]: INFO nova.compute.manager [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Terminating instance [ 671.817640] env[61439]: DEBUG oslo_concurrency.lockutils [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Acquiring lock "refresh_cache-05067c00-8595-412c-ad68-095cba5bf6da" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 671.817919] env[61439]: DEBUG oslo_concurrency.lockutils [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Acquired lock "refresh_cache-05067c00-8595-412c-ad68-095cba5bf6da" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 671.818132] env[61439]: DEBUG nova.network.neutron [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 671.871131] env[61439]: DEBUG nova.network.neutron [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 672.094648] env[61439]: DEBUG nova.network.neutron [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 672.104257] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Releasing lock "refresh_cache-4bce210a-e6bc-4c67-ad48-397422291467" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 672.104695] env[61439]: DEBUG nova.compute.manager [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 672.104894] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 672.105485] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3bfdef46-ba7f-4d8c-a9ee-0af2e7c37743 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.119505] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-622c1db2-cb11-48c0-baf4-4c4eb0220323 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.145257] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4bce210a-e6bc-4c67-ad48-397422291467 could not be found. 
[ 672.145493] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 672.145679] env[61439]: INFO nova.compute.manager [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Took 0.04 seconds to destroy the instance on the hypervisor. [ 672.145935] env[61439]: DEBUG oslo.service.loopingcall [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 672.146171] env[61439]: DEBUG nova.compute.manager [-] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 672.146270] env[61439]: DEBUG nova.network.neutron [-] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 672.238731] env[61439]: DEBUG nova.network.neutron [-] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 672.249298] env[61439]: DEBUG nova.network.neutron [-] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 672.261569] env[61439]: INFO nova.compute.manager [-] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Took 0.12 seconds to deallocate network for instance. [ 672.263651] env[61439]: DEBUG nova.compute.claims [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 672.263869] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 672.265994] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 672.317413] env[61439]: DEBUG nova.network.neutron [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Updating instance_info_cache with 
network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 672.339746] env[61439]: DEBUG oslo_concurrency.lockutils [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Releasing lock "refresh_cache-05067c00-8595-412c-ad68-095cba5bf6da" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 672.340180] env[61439]: DEBUG nova.compute.manager [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 672.340376] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 672.340891] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-187f5ce6-615e-407b-9ab6-cc581f7e4e03 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.356089] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f5ff188-4dfd-4683-8b2d-5cabff102c7b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.398288] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 
05067c00-8595-412c-ad68-095cba5bf6da] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 05067c00-8595-412c-ad68-095cba5bf6da could not be found. [ 672.398607] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 672.398724] env[61439]: INFO nova.compute.manager [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Took 0.06 seconds to destroy the instance on the hypervisor. [ 672.398973] env[61439]: DEBUG oslo.service.loopingcall [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 672.403232] env[61439]: DEBUG nova.compute.manager [-] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 672.403380] env[61439]: DEBUG nova.network.neutron [-] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 672.454582] env[61439]: DEBUG nova.network.neutron [-] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 672.461030] env[61439]: ERROR nova.compute.manager [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 8a72056c-1a3f-41ec-8540-27f0817eba1c, please check neutron logs for more information. [ 672.461030] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 672.461030] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 672.461030] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 672.461030] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 672.461030] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 672.461030] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 672.461030] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 672.461030] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 672.461030] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 672.461030] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 672.461030] env[61439]: ERROR nova.compute.manager raise self.value [ 672.461030] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 672.461030] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 
672.461030] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 672.461030] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 672.461504] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 672.461504] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 672.461504] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 8a72056c-1a3f-41ec-8540-27f0817eba1c, please check neutron logs for more information. [ 672.461504] env[61439]: ERROR nova.compute.manager [ 672.461504] env[61439]: Traceback (most recent call last): [ 672.461504] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 672.461504] env[61439]: listener.cb(fileno) [ 672.461504] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 672.461504] env[61439]: result = function(*args, **kwargs) [ 672.461504] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 672.461504] env[61439]: return func(*args, **kwargs) [ 672.461504] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 672.461504] env[61439]: raise e [ 672.461504] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 672.461504] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 672.461504] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 672.461504] env[61439]: created_port_ids = self._update_ports_for_instance( [ 672.461504] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 672.461504] env[61439]: with excutils.save_and_reraise_exception(): [ 
672.461504] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 672.461504] env[61439]: self.force_reraise() [ 672.461504] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 672.461504] env[61439]: raise self.value [ 672.461504] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 672.461504] env[61439]: updated_port = self._update_port( [ 672.461504] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 672.461504] env[61439]: _ensure_no_port_binding_failure(port) [ 672.461504] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 672.461504] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 672.462402] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 8a72056c-1a3f-41ec-8540-27f0817eba1c, please check neutron logs for more information. [ 672.462402] env[61439]: Removing descriptor: 24 [ 672.462402] env[61439]: ERROR nova.compute.manager [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 8a72056c-1a3f-41ec-8540-27f0817eba1c, please check neutron logs for more information. 
[ 672.462402] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Traceback (most recent call last): [ 672.462402] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 672.462402] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] yield resources [ 672.462402] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 672.462402] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] self.driver.spawn(context, instance, image_meta, [ 672.462402] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 672.462402] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 672.462402] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 672.462402] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] vm_ref = self.build_virtual_machine(instance, [ 672.462938] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 672.462938] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] vif_infos = vmwarevif.get_vif_info(self._session, [ 672.462938] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 672.462938] env[61439]: ERROR 
nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] for vif in network_info: [ 672.462938] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 672.462938] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] return self._sync_wrapper(fn, *args, **kwargs) [ 672.462938] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 672.462938] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] self.wait() [ 672.462938] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 672.462938] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] self[:] = self._gt.wait() [ 672.462938] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 672.462938] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] return self._exit_event.wait() [ 672.462938] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 672.463352] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] result = hub.switch() [ 672.463352] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 672.463352] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] return self.greenlet.switch() [ 672.463352] 
env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 672.463352] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] result = function(*args, **kwargs) [ 672.463352] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 672.463352] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] return func(*args, **kwargs) [ 672.463352] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 672.463352] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] raise e [ 672.463352] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 672.463352] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] nwinfo = self.network_api.allocate_for_instance( [ 672.463352] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 672.463352] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] created_port_ids = self._update_ports_for_instance( [ 672.463680] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 672.463680] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] with excutils.save_and_reraise_exception(): [ 672.463680] env[61439]: ERROR nova.compute.manager 
[instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 672.463680] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] self.force_reraise() [ 672.463680] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 672.463680] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] raise self.value [ 672.463680] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 672.463680] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] updated_port = self._update_port( [ 672.463680] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 672.463680] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] _ensure_no_port_binding_failure(port) [ 672.463680] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 672.463680] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] raise exception.PortBindingFailed(port_id=port['id']) [ 672.463994] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] nova.exception.PortBindingFailed: Binding failed for port 8a72056c-1a3f-41ec-8540-27f0817eba1c, please check neutron logs for more information. 
[ 672.463994] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] [ 672.463994] env[61439]: INFO nova.compute.manager [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Terminating instance [ 672.468319] env[61439]: DEBUG oslo_concurrency.lockutils [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Acquiring lock "refresh_cache-605eecc4-0fff-44bc-9d79-799c4762707a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 672.468527] env[61439]: DEBUG oslo_concurrency.lockutils [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Acquired lock "refresh_cache-605eecc4-0fff-44bc-9d79-799c4762707a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 672.468721] env[61439]: DEBUG nova.network.neutron [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 672.469695] env[61439]: DEBUG nova.network.neutron [-] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 672.483928] env[61439]: INFO nova.compute.manager [-] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Took 0.08 seconds to deallocate network for instance. 
[ 672.486618] env[61439]: DEBUG nova.compute.claims [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 672.488373] env[61439]: DEBUG oslo_concurrency.lockutils [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 672.488606] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3373fbf5-4212-49b2-80fc-e64ef1b2562e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.497618] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f2a2ca7-39ea-4778-b93b-d66881856779 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.538543] env[61439]: DEBUG nova.network.neutron [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 672.540841] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b77e6d48-7a4f-4e18-9077-e8d3c28f91cb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.550785] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6352c364-c074-4677-ba7e-b216154208b7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.568633] env[61439]: DEBUG nova.compute.provider_tree [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 672.584642] env[61439]: DEBUG nova.scheduler.client.report [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 672.604594] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 
tempest-ServersAdminNegativeTestJSON-266339737-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.340s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 672.605515] env[61439]: ERROR nova.compute.manager [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port dc8ac46f-f616-4f92-ad64-f212950b7634, please check neutron logs for more information. [ 672.605515] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Traceback (most recent call last): [ 672.605515] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 672.605515] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] self.driver.spawn(context, instance, image_meta, [ 672.605515] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 672.605515] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] self._vmops.spawn(context, instance, image_meta, injected_files, [ 672.605515] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 672.605515] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] vm_ref = self.build_virtual_machine(instance, [ 672.605515] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 672.605515] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] vif_infos = vmwarevif.get_vif_info(self._session, [ 672.605515] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 672.605868] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] for vif in network_info: [ 672.605868] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 672.605868] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] return self._sync_wrapper(fn, *args, **kwargs) [ 672.605868] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 672.605868] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] self.wait() [ 672.605868] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 672.605868] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] self[:] = self._gt.wait() [ 672.605868] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 672.605868] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] return self._exit_event.wait() [ 672.605868] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 672.605868] env[61439]: 
ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] result = hub.switch() [ 672.605868] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 672.605868] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] return self.greenlet.switch() [ 672.606241] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 672.606241] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] result = function(*args, **kwargs) [ 672.606241] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 672.606241] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] return func(*args, **kwargs) [ 672.606241] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 672.606241] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] raise e [ 672.606241] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 672.606241] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] nwinfo = self.network_api.allocate_for_instance( [ 672.606241] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 672.606241] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] 
created_port_ids = self._update_ports_for_instance( [ 672.606241] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 672.606241] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] with excutils.save_and_reraise_exception(): [ 672.606241] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 672.606646] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] self.force_reraise() [ 672.606646] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 672.606646] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] raise self.value [ 672.606646] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 672.606646] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] updated_port = self._update_port( [ 672.606646] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 672.606646] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] _ensure_no_port_binding_failure(port) [ 672.606646] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 672.606646] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] raise 
exception.PortBindingFailed(port_id=port['id']) [ 672.606646] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] nova.exception.PortBindingFailed: Binding failed for port dc8ac46f-f616-4f92-ad64-f212950b7634, please check neutron logs for more information. [ 672.606646] env[61439]: ERROR nova.compute.manager [instance: 4bce210a-e6bc-4c67-ad48-397422291467] [ 672.606936] env[61439]: DEBUG nova.compute.utils [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Binding failed for port dc8ac46f-f616-4f92-ad64-f212950b7634, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 672.607954] env[61439]: DEBUG oslo_concurrency.lockutils [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.121s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 672.610653] env[61439]: DEBUG nova.compute.manager [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Build of instance 4bce210a-e6bc-4c67-ad48-397422291467 was re-scheduled: Binding failed for port dc8ac46f-f616-4f92-ad64-f212950b7634, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 672.611112] env[61439]: DEBUG nova.compute.manager [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 672.611434] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Acquiring lock "refresh_cache-4bce210a-e6bc-4c67-ad48-397422291467" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 672.611609] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Acquired lock "refresh_cache-4bce210a-e6bc-4c67-ad48-397422291467" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 672.611954] env[61439]: DEBUG nova.network.neutron [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 672.619823] env[61439]: DEBUG nova.network.neutron [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Successfully created port: bae68af0-39c4-46fb-80c7-7a45da13e70a {{(pid=61439) _create_port_minimal 
/opt/stack/nova/nova/network/neutron.py:548}} [ 672.673673] env[61439]: DEBUG nova.network.neutron [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 672.712705] env[61439]: DEBUG nova.network.neutron [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 672.725315] env[61439]: DEBUG oslo_concurrency.lockutils [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Releasing lock "refresh_cache-605eecc4-0fff-44bc-9d79-799c4762707a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 672.725315] env[61439]: DEBUG nova.compute.manager [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 672.725315] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 672.725610] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ded7aa0c-54d3-46fc-a15e-a58ac05610d1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.740952] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a056562e-e7fb-4dda-8528-14c16e05f341 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.772482] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 605eecc4-0fff-44bc-9d79-799c4762707a could not be found. 
[ 672.772756] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 672.773803] env[61439]: INFO nova.compute.manager [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Took 0.05 seconds to destroy the instance on the hypervisor. [ 672.774244] env[61439]: DEBUG oslo.service.loopingcall [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 672.779091] env[61439]: DEBUG nova.compute.manager [-] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 672.779091] env[61439]: DEBUG nova.network.neutron [-] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 672.819593] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72cdc8ef-ded8-41f2-acac-cb33ddd91c13 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.823794] env[61439]: DEBUG nova.network.neutron [-] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 672.830385] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6071d748-0d89-4b74-9b01-6b1aa3e309d3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.835388] env[61439]: DEBUG nova.network.neutron [-] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 672.840142] env[61439]: DEBUG nova.network.neutron [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 672.869569] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Releasing lock "refresh_cache-4bce210a-e6bc-4c67-ad48-397422291467" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 672.869819] env[61439]: DEBUG nova.compute.manager [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 672.869965] env[61439]: DEBUG nova.compute.manager [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 672.870159] env[61439]: DEBUG nova.network.neutron [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 672.872088] env[61439]: INFO nova.compute.manager [-] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Took 0.09 seconds to deallocate network for instance. [ 672.873187] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1dfb608c-74a4-4f66-a88d-eb371961a459 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.877460] env[61439]: DEBUG nova.compute.claims [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 672.877634] env[61439]: DEBUG oslo_concurrency.lockutils [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 672.883784] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4382037-f0d0-49ed-bec1-20f351aca18c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.898438] env[61439]: DEBUG nova.compute.provider_tree [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 672.907148] env[61439]: DEBUG nova.scheduler.client.report [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 672.923439] env[61439]: DEBUG nova.network.neutron [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 672.930225] env[61439]: DEBUG oslo_concurrency.lockutils [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.322s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 672.930970] env[61439]: ERROR nova.compute.manager [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port b4c6a589-d627-4d9e-b784-1847a9fab568, please check neutron logs for more information. [ 672.930970] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Traceback (most recent call last): [ 672.930970] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 672.930970] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] self.driver.spawn(context, instance, image_meta, [ 672.930970] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 672.930970] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] self._vmops.spawn(context, instance, image_meta, injected_files, [ 672.930970] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 672.930970] env[61439]: ERROR 
nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] vm_ref = self.build_virtual_machine(instance, [ 672.930970] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 672.930970] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] vif_infos = vmwarevif.get_vif_info(self._session, [ 672.930970] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 672.931377] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] for vif in network_info: [ 672.931377] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 672.931377] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] return self._sync_wrapper(fn, *args, **kwargs) [ 672.931377] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 672.931377] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] self.wait() [ 672.931377] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 672.931377] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] self[:] = self._gt.wait() [ 672.931377] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 672.931377] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] return 
self._exit_event.wait() [ 672.931377] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 672.931377] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] result = hub.switch() [ 672.931377] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 672.931377] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] return self.greenlet.switch() [ 672.931693] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 672.931693] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] result = function(*args, **kwargs) [ 672.931693] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 672.931693] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] return func(*args, **kwargs) [ 672.931693] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 672.931693] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] raise e [ 672.931693] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 672.931693] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] nwinfo = self.network_api.allocate_for_instance( [ 672.931693] env[61439]: ERROR 
nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 672.931693] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] created_port_ids = self._update_ports_for_instance( [ 672.931693] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 672.931693] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] with excutils.save_and_reraise_exception(): [ 672.931693] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 672.933941] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] self.force_reraise() [ 672.933941] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 672.933941] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] raise self.value [ 672.933941] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 672.933941] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] updated_port = self._update_port( [ 672.933941] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 672.933941] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] _ensure_no_port_binding_failure(port) [ 672.933941] env[61439]: ERROR nova.compute.manager 
[instance: 05067c00-8595-412c-ad68-095cba5bf6da] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 672.933941] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] raise exception.PortBindingFailed(port_id=port['id']) [ 672.933941] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] nova.exception.PortBindingFailed: Binding failed for port b4c6a589-d627-4d9e-b784-1847a9fab568, please check neutron logs for more information. [ 672.933941] env[61439]: ERROR nova.compute.manager [instance: 05067c00-8595-412c-ad68-095cba5bf6da] [ 672.934251] env[61439]: DEBUG nova.compute.utils [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Binding failed for port b4c6a589-d627-4d9e-b784-1847a9fab568, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 672.934251] env[61439]: DEBUG oslo_concurrency.lockutils [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.055s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 672.937274] env[61439]: DEBUG nova.compute.manager [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Build of instance 05067c00-8595-412c-ad68-095cba5bf6da was re-scheduled: Binding failed for port b4c6a589-d627-4d9e-b784-1847a9fab568, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 672.937525] env[61439]: DEBUG nova.compute.manager [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 672.937841] env[61439]: DEBUG oslo_concurrency.lockutils [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Acquiring lock "refresh_cache-05067c00-8595-412c-ad68-095cba5bf6da" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 672.938034] env[61439]: DEBUG oslo_concurrency.lockutils [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Acquired lock "refresh_cache-05067c00-8595-412c-ad68-095cba5bf6da" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 672.938208] env[61439]: DEBUG nova.network.neutron [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 672.943019] env[61439]: DEBUG nova.network.neutron [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info 
/opt/stack/nova/nova/network/neutron.py:116}} [ 672.962067] env[61439]: INFO nova.compute.manager [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] [instance: 4bce210a-e6bc-4c67-ad48-397422291467] Took 0.09 seconds to deallocate network for instance. [ 672.986500] env[61439]: DEBUG nova.network.neutron [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 673.080695] env[61439]: INFO nova.scheduler.client.report [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Deleted allocations for instance 4bce210a-e6bc-4c67-ad48-397422291467 [ 673.107014] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f2c19f30-b06b-494a-855f-84876c33f6d0 tempest-ServersAdminNegativeTestJSON-266339737 tempest-ServersAdminNegativeTestJSON-266339737-project-member] Lock "4bce210a-e6bc-4c67-ad48-397422291467" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.381s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 673.144353] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8ea41cb-8118-4e0c-b150-2b76ec14458a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 673.153180] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d713a280-1b77-4c1a-b7f3-1a3a197bf7d3 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 673.189645] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8e75b82-7ee6-4fef-8003-ec7c3bfd98b4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 673.198985] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b21f2ce2-f814-45a8-80b2-14cf21ab066c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 673.215492] env[61439]: DEBUG nova.compute.provider_tree [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 673.225419] env[61439]: DEBUG nova.scheduler.client.report [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 673.243470] env[61439]: DEBUG oslo_concurrency.lockutils [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.311s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 673.244120] env[61439]: ERROR nova.compute.manager [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 8a72056c-1a3f-41ec-8540-27f0817eba1c, please check neutron logs for more information. [ 673.244120] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Traceback (most recent call last): [ 673.244120] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 673.244120] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] self.driver.spawn(context, instance, image_meta, [ 673.244120] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 673.244120] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 673.244120] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 673.244120] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] vm_ref = self.build_virtual_machine(instance, [ 673.244120] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 
673.244120] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] vif_infos = vmwarevif.get_vif_info(self._session, [ 673.244120] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 673.244487] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] for vif in network_info: [ 673.244487] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 673.244487] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] return self._sync_wrapper(fn, *args, **kwargs) [ 673.244487] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 673.244487] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] self.wait() [ 673.244487] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 673.244487] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] self[:] = self._gt.wait() [ 673.244487] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 673.244487] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] return self._exit_event.wait() [ 673.244487] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 673.244487] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] result = 
hub.switch() [ 673.244487] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 673.244487] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] return self.greenlet.switch() [ 673.244865] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 673.244865] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] result = function(*args, **kwargs) [ 673.244865] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 673.244865] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] return func(*args, **kwargs) [ 673.244865] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 673.244865] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] raise e [ 673.244865] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 673.244865] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] nwinfo = self.network_api.allocate_for_instance( [ 673.244865] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 673.244865] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] created_port_ids = self._update_ports_for_instance( [ 673.244865] env[61439]: ERROR 
nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 673.244865] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] with excutils.save_and_reraise_exception(): [ 673.244865] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 673.245386] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] self.force_reraise() [ 673.245386] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 673.245386] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] raise self.value [ 673.245386] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 673.245386] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] updated_port = self._update_port( [ 673.245386] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 673.245386] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] _ensure_no_port_binding_failure(port) [ 673.245386] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 673.245386] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] raise exception.PortBindingFailed(port_id=port['id']) [ 673.245386] env[61439]: ERROR 
nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] nova.exception.PortBindingFailed: Binding failed for port 8a72056c-1a3f-41ec-8540-27f0817eba1c, please check neutron logs for more information. [ 673.245386] env[61439]: ERROR nova.compute.manager [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] [ 673.245777] env[61439]: DEBUG nova.compute.utils [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Binding failed for port 8a72056c-1a3f-41ec-8540-27f0817eba1c, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 673.247539] env[61439]: DEBUG nova.compute.manager [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Build of instance 605eecc4-0fff-44bc-9d79-799c4762707a was re-scheduled: Binding failed for port 8a72056c-1a3f-41ec-8540-27f0817eba1c, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 673.247706] env[61439]: DEBUG nova.compute.manager [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 673.247807] env[61439]: DEBUG oslo_concurrency.lockutils [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Acquiring lock "refresh_cache-605eecc4-0fff-44bc-9d79-799c4762707a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 673.247953] env[61439]: DEBUG oslo_concurrency.lockutils [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Acquired lock "refresh_cache-605eecc4-0fff-44bc-9d79-799c4762707a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 673.248437] env[61439]: DEBUG nova.network.neutron [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 673.434596] env[61439]: DEBUG nova.network.neutron [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 673.486401] env[61439]: DEBUG nova.network.neutron [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 673.506915] env[61439]: DEBUG oslo_concurrency.lockutils [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Releasing lock "refresh_cache-05067c00-8595-412c-ad68-095cba5bf6da" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 673.507193] env[61439]: DEBUG nova.compute.manager [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 673.507349] env[61439]: DEBUG nova.compute.manager [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 673.507614] env[61439]: DEBUG nova.network.neutron [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 673.564147] env[61439]: DEBUG nova.network.neutron [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 673.575616] env[61439]: DEBUG nova.network.neutron [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 673.592373] env[61439]: INFO nova.compute.manager [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: 05067c00-8595-412c-ad68-095cba5bf6da] Took 0.08 seconds to deallocate network for instance. 
[ 673.621602] env[61439]: DEBUG nova.network.neutron [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 673.638123] env[61439]: DEBUG oslo_concurrency.lockutils [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Releasing lock "refresh_cache-605eecc4-0fff-44bc-9d79-799c4762707a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 673.638379] env[61439]: DEBUG nova.compute.manager [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 673.638528] env[61439]: DEBUG nova.compute.manager [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 673.638692] env[61439]: DEBUG nova.network.neutron [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 673.677938] env[61439]: DEBUG nova.network.neutron [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 673.686316] env[61439]: DEBUG nova.network.neutron [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 673.702859] env[61439]: INFO nova.compute.manager [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] [instance: 605eecc4-0fff-44bc-9d79-799c4762707a] Took 0.06 seconds to deallocate network for instance. 
[ 673.734838] env[61439]: INFO nova.scheduler.client.report [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Deleted allocations for instance 05067c00-8595-412c-ad68-095cba5bf6da [ 673.773139] env[61439]: DEBUG oslo_concurrency.lockutils [None req-6a5403d9-affe-45b5-a051-50d6829a768a tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Lock "05067c00-8595-412c-ad68-095cba5bf6da" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.023s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 673.852300] env[61439]: INFO nova.scheduler.client.report [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Deleted allocations for instance 605eecc4-0fff-44bc-9d79-799c4762707a [ 673.875846] env[61439]: DEBUG oslo_concurrency.lockutils [None req-77e04db6-3318-4c6e-a19b-9e54ea667fc7 tempest-DeleteServersAdminTestJSON-2134891279 tempest-DeleteServersAdminTestJSON-2134891279-project-member] Lock "605eecc4-0fff-44bc-9d79-799c4762707a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 9.589s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 674.603957] env[61439]: ERROR nova.compute.manager [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 2368fddb-20b2-40b3-b7d3-7d1fce015c95, please check neutron logs for more information. 
[ 674.603957] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 674.603957] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 674.603957] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 674.603957] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 674.603957] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 674.603957] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 674.603957] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 674.603957] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 674.603957] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 674.603957] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 674.603957] env[61439]: ERROR nova.compute.manager raise self.value [ 674.603957] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 674.603957] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 674.603957] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 674.603957] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 674.604688] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 674.604688] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 674.604688] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 2368fddb-20b2-40b3-b7d3-7d1fce015c95, please check neutron logs for more information. [ 674.604688] env[61439]: ERROR nova.compute.manager [ 674.604688] env[61439]: Traceback (most recent call last): [ 674.604688] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 674.604688] env[61439]: listener.cb(fileno) [ 674.604688] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 674.604688] env[61439]: result = function(*args, **kwargs) [ 674.604688] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 674.604688] env[61439]: return func(*args, **kwargs) [ 674.604688] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 674.604688] env[61439]: raise e [ 674.604688] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 674.604688] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 674.604688] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 674.604688] env[61439]: created_port_ids = self._update_ports_for_instance( [ 674.604688] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 674.604688] env[61439]: with excutils.save_and_reraise_exception(): [ 674.604688] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 674.604688] env[61439]: self.force_reraise() [ 674.604688] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 674.604688] env[61439]: raise self.value [ 674.604688] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 674.604688] env[61439]: 
updated_port = self._update_port( [ 674.604688] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 674.604688] env[61439]: _ensure_no_port_binding_failure(port) [ 674.604688] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 674.604688] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 674.605450] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 2368fddb-20b2-40b3-b7d3-7d1fce015c95, please check neutron logs for more information. [ 674.605450] env[61439]: Removing descriptor: 10 [ 674.605450] env[61439]: ERROR nova.compute.manager [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 2368fddb-20b2-40b3-b7d3-7d1fce015c95, please check neutron logs for more information. 
[ 674.605450] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Traceback (most recent call last): [ 674.605450] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 674.605450] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] yield resources [ 674.605450] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 674.605450] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] self.driver.spawn(context, instance, image_meta, [ 674.605450] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 674.605450] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 674.605450] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 674.605450] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] vm_ref = self.build_virtual_machine(instance, [ 674.605768] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 674.605768] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] vif_infos = vmwarevif.get_vif_info(self._session, [ 674.605768] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 674.605768] env[61439]: ERROR 
nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] for vif in network_info:
[ 674.605768] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 674.605768] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] return self._sync_wrapper(fn, *args, **kwargs)
[ 674.605768] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 674.605768] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] self.wait()
[ 674.605768] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 674.605768] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] self[:] = self._gt.wait()
[ 674.605768] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 674.605768] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] return self._exit_event.wait()
[ 674.605768] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 674.606342] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] result = hub.switch()
[ 674.606342] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 674.606342] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] return self.greenlet.switch()
[ 674.606342] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 674.606342] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] result = function(*args, **kwargs)
[ 674.606342] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 674.606342] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] return func(*args, **kwargs)
[ 674.606342] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 674.606342] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] raise e
[ 674.606342] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 674.606342] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] nwinfo = self.network_api.allocate_for_instance(
[ 674.606342] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 674.606342] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] created_port_ids = self._update_ports_for_instance(
[ 674.606696] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 674.606696] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] with excutils.save_and_reraise_exception():
[ 674.606696] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 674.606696] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] self.force_reraise()
[ 674.606696] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 674.606696] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] raise self.value
[ 674.606696] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 674.606696] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] updated_port = self._update_port(
[ 674.606696] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 674.606696] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] _ensure_no_port_binding_failure(port)
[ 674.606696] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 674.606696] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] raise exception.PortBindingFailed(port_id=port['id'])
[ 674.607105] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] nova.exception.PortBindingFailed: Binding failed for port 2368fddb-20b2-40b3-b7d3-7d1fce015c95, please check neutron logs for more information.
[ 674.607105] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5]
[ 674.607105] env[61439]: INFO nova.compute.manager [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Terminating instance
[ 674.608820] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Acquiring lock "refresh_cache-edb47400-6749-4235-bbb1-ffa648f3dba5" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 674.608988] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Acquired lock "refresh_cache-edb47400-6749-4235-bbb1-ffa648f3dba5" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 674.609238] env[61439]: DEBUG nova.network.neutron [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 674.691267] env[61439]: DEBUG nova.network.neutron [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 675.095965] env[61439]: DEBUG nova.network.neutron [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 675.113056] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Releasing lock "refresh_cache-edb47400-6749-4235-bbb1-ffa648f3dba5" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 675.113056] env[61439]: DEBUG nova.compute.manager [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 675.113056] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 675.113056] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f459cea7-ae06-43eb-8cfe-0b9a719f4f0a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 675.125113] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9088090-4b28-436a-89f6-7e5c76440600 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 675.154234] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance edb47400-6749-4235-bbb1-ffa648f3dba5 could not be found.
[ 675.154791] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 675.155147] env[61439]: INFO nova.compute.manager [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 675.155545] env[61439]: DEBUG oslo.service.loopingcall [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 675.158332] env[61439]: DEBUG nova.compute.manager [-] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 675.158332] env[61439]: DEBUG nova.network.neutron [-] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 675.206929] env[61439]: DEBUG nova.network.neutron [-] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 675.216279] env[61439]: DEBUG nova.network.neutron [-] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 675.230751] env[61439]: INFO nova.compute.manager [-] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Took 0.07 seconds to deallocate network for instance.
[ 675.231641] env[61439]: DEBUG nova.compute.claims [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 675.232240] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 675.233107] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 675.411690] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da07309c-a23d-49dd-bb9f-8b68d818446f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 675.429035] env[61439]: DEBUG oslo_concurrency.lockutils [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Acquiring lock "4998340d-3afc-4fc7-bc2a-57913b534d36" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 675.429035] env[61439]: DEBUG oslo_concurrency.lockutils [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Lock "4998340d-3afc-4fc7-bc2a-57913b534d36" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 675.434400] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-553dbc9e-463b-4b7a-a3f6-37bf4371a2a0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 675.471493] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8036b7f4-6a63-4202-8842-5c183054ec57 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 675.475260] env[61439]: DEBUG nova.compute.manager [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 675.487020] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7495b20e-b3d2-48c5-8349-809a69b1b44c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 675.503656] env[61439]: DEBUG nova.compute.provider_tree [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 675.515880] env[61439]: DEBUG nova.scheduler.client.report [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 675.538091] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.305s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 675.539094] env[61439]: ERROR nova.compute.manager [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 2368fddb-20b2-40b3-b7d3-7d1fce015c95, please check neutron logs for more information.
[ 675.539094] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Traceback (most recent call last):
[ 675.539094] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 675.539094] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] self.driver.spawn(context, instance, image_meta,
[ 675.539094] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 675.539094] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 675.539094] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 675.539094] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] vm_ref = self.build_virtual_machine(instance,
[ 675.539094] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 675.539094] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] vif_infos = vmwarevif.get_vif_info(self._session,
[ 675.539094] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 675.539442] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] for vif in network_info:
[ 675.539442] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 675.539442] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] return self._sync_wrapper(fn, *args, **kwargs)
[ 675.539442] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 675.539442] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] self.wait()
[ 675.539442] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 675.539442] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] self[:] = self._gt.wait()
[ 675.539442] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 675.539442] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] return self._exit_event.wait()
[ 675.539442] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 675.539442] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] result = hub.switch()
[ 675.539442] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 675.539442] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] return self.greenlet.switch()
[ 675.539792] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 675.539792] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] result = function(*args, **kwargs)
[ 675.539792] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 675.539792] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] return func(*args, **kwargs)
[ 675.539792] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 675.539792] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] raise e
[ 675.539792] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 675.539792] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] nwinfo = self.network_api.allocate_for_instance(
[ 675.539792] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 675.539792] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] created_port_ids = self._update_ports_for_instance(
[ 675.539792] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 675.539792] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] with excutils.save_and_reraise_exception():
[ 675.539792] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 675.540152] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] self.force_reraise()
[ 675.540152] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 675.540152] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] raise self.value
[ 675.540152] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 675.540152] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] updated_port = self._update_port(
[ 675.540152] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 675.540152] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] _ensure_no_port_binding_failure(port)
[ 675.540152] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 675.540152] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] raise exception.PortBindingFailed(port_id=port['id'])
[ 675.540152] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] nova.exception.PortBindingFailed: Binding failed for port 2368fddb-20b2-40b3-b7d3-7d1fce015c95, please check neutron logs for more information.
[ 675.540152] env[61439]: ERROR nova.compute.manager [instance: edb47400-6749-4235-bbb1-ffa648f3dba5]
[ 675.540722] env[61439]: DEBUG nova.compute.utils [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Binding failed for port 2368fddb-20b2-40b3-b7d3-7d1fce015c95, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 675.543482] env[61439]: DEBUG nova.compute.manager [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Build of instance edb47400-6749-4235-bbb1-ffa648f3dba5 was re-scheduled: Binding failed for port 2368fddb-20b2-40b3-b7d3-7d1fce015c95, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 675.544161] env[61439]: DEBUG nova.compute.manager [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 675.544514] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Acquiring lock "refresh_cache-edb47400-6749-4235-bbb1-ffa648f3dba5" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 675.544983] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Acquired lock "refresh_cache-edb47400-6749-4235-bbb1-ffa648f3dba5" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 675.545422] env[61439]: DEBUG nova.network.neutron [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 675.557638] env[61439]: DEBUG oslo_concurrency.lockutils [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 675.558200] env[61439]: DEBUG oslo_concurrency.lockutils [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 675.559996] env[61439]: INFO nova.compute.claims [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 675.616769] env[61439]: DEBUG nova.network.neutron [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 675.712043] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c4b517d-b311-4df7-938b-04e6d4d52fb0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 675.720946] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21c85737-4799-44fa-aa82-6222cbdb279b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 675.754967] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bdfb4a8-bed7-4c8e-a2de-268d2df346d7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 675.765015] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d63b9fc-673b-4116-8d82-3be54e4fc826 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 675.783823] env[61439]: DEBUG nova.compute.provider_tree [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 675.795313] env[61439]: DEBUG nova.scheduler.client.report [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 675.823221] env[61439]: DEBUG oslo_concurrency.lockutils [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.265s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 675.823801] env[61439]: DEBUG nova.compute.manager [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 675.886048] env[61439]: DEBUG nova.compute.utils [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 675.890435] env[61439]: DEBUG nova.compute.manager [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 675.890651] env[61439]: DEBUG nova.network.neutron [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 675.903320] env[61439]: DEBUG nova.compute.manager [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 675.952468] env[61439]: DEBUG nova.network.neutron [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 675.982366] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Releasing lock "refresh_cache-edb47400-6749-4235-bbb1-ffa648f3dba5" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 675.982607] env[61439]: DEBUG nova.compute.manager [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 675.982792] env[61439]: DEBUG nova.compute.manager [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 675.983033] env[61439]: DEBUG nova.network.neutron [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 676.012987] env[61439]: DEBUG nova.compute.manager [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 676.025220] env[61439]: DEBUG nova.network.neutron [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 676.039393] env[61439]: DEBUG nova.network.neutron [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 676.046023] env[61439]: DEBUG nova.virt.hardware [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 676.046288] env[61439]: DEBUG nova.virt.hardware [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 676.046452] env[61439]: DEBUG nova.virt.hardware [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 676.047074] env[61439]: DEBUG nova.virt.hardware [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 676.047074] env[61439]: DEBUG nova.virt.hardware [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 676.047074] env[61439]: DEBUG nova.virt.hardware [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 676.048999] env[61439]: DEBUG nova.virt.hardware [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 676.048999] env[61439]: DEBUG nova.virt.hardware [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 676.049197] env[61439]: DEBUG nova.virt.hardware [None
req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 676.049261] env[61439]: DEBUG nova.virt.hardware [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 676.049501] env[61439]: DEBUG nova.virt.hardware [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 676.053647] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a16054ad-70ae-435a-a0c0-088ad2f59c03 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 676.057138] env[61439]: INFO nova.compute.manager [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: edb47400-6749-4235-bbb1-ffa648f3dba5] Took 0.07 seconds to deallocate network for instance. 
[ 676.068393] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2af5439-6580-4cca-9574-e3e201716dbd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 676.204297] env[61439]: INFO nova.scheduler.client.report [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Deleted allocations for instance edb47400-6749-4235-bbb1-ffa648f3dba5 [ 676.228019] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9006b775-fce8-4ed2-a697-d3002b236004 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Lock "edb47400-6749-4235-bbb1-ffa648f3dba5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.691s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 676.248606] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Acquiring lock "f66615e9-2e8c-43d9-be73-154398e26934" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 676.248775] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Lock "f66615e9-2e8c-43d9-be73-154398e26934" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 676.264963] env[61439]: DEBUG nova.compute.manager [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 676.330688] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 676.331096] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 676.332557] env[61439]: INFO nova.compute.claims [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 676.339542] env[61439]: DEBUG nova.policy [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '15f09ca9b4fc4a6f9100a73dac11dcc2', 'user_domain_id': 
'default', 'system_scope': None, 'domain_id': None, 'project_id': '3511963fac0e4f2db836bdd43d779946', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 676.534367] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e812e7d-5682-4702-9705-bbcb6d12d24b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 676.543025] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4d57621-0600-44cd-bd64-c1941fbd94f8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 676.577131] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91a356a4-771d-4290-9562-7ad2c7c4b18c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 676.586570] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5ca43c3-36f9-4c8d-8e0c-e19275080f13 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 676.603431] env[61439]: DEBUG nova.compute.provider_tree [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 676.614670] env[61439]: DEBUG nova.scheduler.client.report [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 
tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 676.638119] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.307s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 676.638422] env[61439]: DEBUG nova.compute.manager [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Start building networks asynchronously for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 676.685076] env[61439]: DEBUG nova.compute.utils [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 676.689619] env[61439]: DEBUG nova.compute.manager [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 676.689619] env[61439]: DEBUG nova.network.neutron [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 676.697024] env[61439]: DEBUG nova.compute.manager [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 676.774157] env[61439]: DEBUG nova.compute.manager [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 676.807576] env[61439]: DEBUG nova.virt.hardware [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 676.807576] env[61439]: DEBUG nova.virt.hardware [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 676.807576] env[61439]: DEBUG nova.virt.hardware [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 676.807809] env[61439]: DEBUG nova.virt.hardware [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 
tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 676.807809] env[61439]: DEBUG nova.virt.hardware [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 676.807809] env[61439]: DEBUG nova.virt.hardware [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 676.808257] env[61439]: DEBUG nova.virt.hardware [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 676.808464] env[61439]: DEBUG nova.virt.hardware [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 676.808669] env[61439]: DEBUG nova.virt.hardware [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 676.809056] env[61439]: DEBUG nova.virt.hardware 
[None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 676.809293] env[61439]: DEBUG nova.virt.hardware [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 676.812286] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00d9e31f-193e-4559-835d-df4175cfc09d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 676.822543] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4bd8a8e-fd3c-4b6d-91c1-9f2a0e736fd4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 676.966356] env[61439]: DEBUG nova.policy [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a1b212297e564783987dc59263fc50c8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '24fe9a8f68c746dea82c0ebcd5408e85', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 677.286669] 
env[61439]: ERROR nova.compute.manager [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 8a86f436-b6f2-45a2-872f-ea805c774367, please check neutron logs for more information. [ 677.286669] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 677.286669] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 677.286669] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 677.286669] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 677.286669] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 677.286669] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 677.286669] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 677.286669] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 677.286669] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 677.286669] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 677.286669] env[61439]: ERROR nova.compute.manager raise self.value [ 677.286669] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 677.286669] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 677.286669] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in 
_update_port [ 677.286669] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 677.287164] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 677.287164] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 677.287164] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 8a86f436-b6f2-45a2-872f-ea805c774367, please check neutron logs for more information. [ 677.287164] env[61439]: ERROR nova.compute.manager [ 677.287654] env[61439]: Traceback (most recent call last): [ 677.287715] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 677.287715] env[61439]: listener.cb(fileno) [ 677.287715] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 677.287715] env[61439]: result = function(*args, **kwargs) [ 677.287715] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 677.287715] env[61439]: return func(*args, **kwargs) [ 677.287715] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 677.287715] env[61439]: raise e [ 677.287947] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 677.287947] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 677.287947] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 677.287947] env[61439]: created_port_ids = self._update_ports_for_instance( [ 677.287947] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 677.287947] env[61439]: with excutils.save_and_reraise_exception(): [ 677.287947] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in 
__exit__ [ 677.287947] env[61439]: self.force_reraise() [ 677.287947] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 677.287947] env[61439]: raise self.value [ 677.287947] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 677.287947] env[61439]: updated_port = self._update_port( [ 677.287947] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 677.287947] env[61439]: _ensure_no_port_binding_failure(port) [ 677.287947] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 677.287947] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 677.287947] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 8a86f436-b6f2-45a2-872f-ea805c774367, please check neutron logs for more information. [ 677.287947] env[61439]: Removing descriptor: 22 [ 677.289497] env[61439]: ERROR nova.compute.manager [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 8a86f436-b6f2-45a2-872f-ea805c774367, please check neutron logs for more information. 
[ 677.289497] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Traceback (most recent call last): [ 677.289497] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 677.289497] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] yield resources [ 677.289497] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 677.289497] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] self.driver.spawn(context, instance, image_meta, [ 677.289497] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 677.289497] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 677.289497] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 677.289497] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] vm_ref = self.build_virtual_machine(instance, [ 677.289497] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 677.289806] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] vif_infos = vmwarevif.get_vif_info(self._session, [ 677.289806] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 677.289806] env[61439]: ERROR 
nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] for vif in network_info: [ 677.289806] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 677.289806] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] return self._sync_wrapper(fn, *args, **kwargs) [ 677.289806] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 677.289806] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] self.wait() [ 677.289806] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 677.289806] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] self[:] = self._gt.wait() [ 677.289806] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 677.289806] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] return self._exit_event.wait() [ 677.289806] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 677.289806] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] result = hub.switch() [ 677.290271] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 677.290271] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] return self.greenlet.switch() [ 677.290271] 
env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 677.290271] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] result = function(*args, **kwargs) [ 677.290271] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 677.290271] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] return func(*args, **kwargs) [ 677.290271] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 677.290271] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] raise e [ 677.290271] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 677.290271] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] nwinfo = self.network_api.allocate_for_instance( [ 677.290271] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 677.290271] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] created_port_ids = self._update_ports_for_instance( [ 677.290271] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 677.290665] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] with excutils.save_and_reraise_exception(): [ 677.290665] env[61439]: ERROR nova.compute.manager 
[instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 677.290665] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] self.force_reraise() [ 677.290665] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 677.290665] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] raise self.value [ 677.290665] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 677.290665] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] updated_port = self._update_port( [ 677.290665] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 677.290665] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] _ensure_no_port_binding_failure(port) [ 677.290665] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 677.290665] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] raise exception.PortBindingFailed(port_id=port['id']) [ 677.290665] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] nova.exception.PortBindingFailed: Binding failed for port 8a86f436-b6f2-45a2-872f-ea805c774367, please check neutron logs for more information. 
[ 677.290665] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a]
[ 677.291088] env[61439]: INFO nova.compute.manager [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Terminating instance
[ 677.296189] env[61439]: DEBUG oslo_concurrency.lockutils [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Acquiring lock "refresh_cache-e32ca59c-5bce-40e5-85e9-0979e673688a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 677.296189] env[61439]: DEBUG oslo_concurrency.lockutils [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Acquired lock "refresh_cache-e32ca59c-5bce-40e5-85e9-0979e673688a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 677.296189] env[61439]: DEBUG nova.network.neutron [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 677.375261] env[61439]: DEBUG nova.network.neutron [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 677.698185] env[61439]: DEBUG nova.network.neutron [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 677.718390] env[61439]: DEBUG oslo_concurrency.lockutils [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Releasing lock "refresh_cache-e32ca59c-5bce-40e5-85e9-0979e673688a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 677.718390] env[61439]: DEBUG nova.compute.manager [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 677.718390] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 677.718390] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4d3e0ae0-3cb8-4625-9ba4-05829bd0a962 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 677.730266] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3dbb366c-7165-483f-b18b-b14906320914 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 677.762319] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e32ca59c-5bce-40e5-85e9-0979e673688a could not be found.
[ 677.763029] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 677.763319] env[61439]: INFO nova.compute.manager [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Took 0.05 seconds to destroy the instance on the hypervisor.
[ 677.764600] env[61439]: DEBUG oslo.service.loopingcall [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 677.764600] env[61439]: DEBUG nova.compute.manager [-] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 677.764600] env[61439]: DEBUG nova.network.neutron [-] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 677.993487] env[61439]: ERROR nova.compute.manager [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port bae68af0-39c4-46fb-80c7-7a45da13e70a, please check neutron logs for more information.
[ 677.993487] env[61439]: ERROR nova.compute.manager Traceback (most recent call last):
[ 677.993487] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 677.993487] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 677.993487] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 677.993487] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 677.993487] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 677.993487] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 677.993487] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 677.993487] env[61439]: ERROR nova.compute.manager self.force_reraise()
[ 677.993487] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 677.993487] env[61439]: ERROR nova.compute.manager raise self.value
[ 677.993487] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 677.993487] env[61439]: ERROR nova.compute.manager updated_port = self._update_port(
[ 677.993487] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 677.993487] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 677.994183] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 677.994183] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 677.994183] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port bae68af0-39c4-46fb-80c7-7a45da13e70a, please check neutron logs for more information.
[ 677.994183] env[61439]: ERROR nova.compute.manager
[ 677.994183] env[61439]: Traceback (most recent call last):
[ 677.994183] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait
[ 677.994183] env[61439]: listener.cb(fileno)
[ 677.994183] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 677.994183] env[61439]: result = function(*args, **kwargs)
[ 677.994183] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 677.994183] env[61439]: return func(*args, **kwargs)
[ 677.994183] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 677.994183] env[61439]: raise e
[ 677.994183] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 677.994183] env[61439]: nwinfo = self.network_api.allocate_for_instance(
[ 677.994183] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 677.994183] env[61439]: created_port_ids = self._update_ports_for_instance(
[ 677.994183] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 677.994183] env[61439]: with excutils.save_and_reraise_exception():
[ 677.994183] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 677.994183] env[61439]: self.force_reraise()
[ 677.994183] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 677.994183] env[61439]: raise self.value
[ 677.994183] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 677.994183] env[61439]: updated_port = self._update_port(
[ 677.994183] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 677.994183] env[61439]: _ensure_no_port_binding_failure(port)
[ 677.994183] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 677.994183] env[61439]: raise exception.PortBindingFailed(port_id=port['id'])
[ 677.995390] env[61439]: nova.exception.PortBindingFailed: Binding failed for port bae68af0-39c4-46fb-80c7-7a45da13e70a, please check neutron logs for more information.
[ 677.995390] env[61439]: Removing descriptor: 20
[ 677.995390] env[61439]: ERROR nova.compute.manager [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port bae68af0-39c4-46fb-80c7-7a45da13e70a, please check neutron logs for more information.
[ 677.995390] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Traceback (most recent call last):
[ 677.995390] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 677.995390] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] yield resources
[ 677.995390] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 677.995390] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] self.driver.spawn(context, instance, image_meta,
[ 677.995390] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 677.995390] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 677.995390] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 677.995390] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] vm_ref = self.build_virtual_machine(instance,
[ 677.995882] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 677.995882] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] vif_infos = vmwarevif.get_vif_info(self._session,
[ 677.995882] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 677.995882] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] for vif in network_info:
[ 677.995882] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 677.995882] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] return self._sync_wrapper(fn, *args, **kwargs)
[ 677.995882] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 677.995882] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] self.wait()
[ 677.995882] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 677.995882] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] self[:] = self._gt.wait()
[ 677.995882] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 677.995882] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] return self._exit_event.wait()
[ 677.995882] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 677.996722] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] result = hub.switch()
[ 677.996722] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 677.996722] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] return self.greenlet.switch()
[ 677.996722] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 677.996722] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] result = function(*args, **kwargs)
[ 677.996722] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 677.996722] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] return func(*args, **kwargs)
[ 677.996722] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 677.996722] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] raise e
[ 677.996722] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 677.996722] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] nwinfo = self.network_api.allocate_for_instance(
[ 677.996722] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 677.996722] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] created_port_ids = self._update_ports_for_instance(
[ 677.997083] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 677.997083] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] with excutils.save_and_reraise_exception():
[ 677.997083] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 677.997083] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] self.force_reraise()
[ 677.997083] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 677.997083] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] raise self.value
[ 677.997083] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 677.997083] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] updated_port = self._update_port(
[ 677.997083] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 677.997083] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] _ensure_no_port_binding_failure(port)
[ 677.997083] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 677.997083] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] raise exception.PortBindingFailed(port_id=port['id'])
[ 677.997423] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] nova.exception.PortBindingFailed: Binding failed for port bae68af0-39c4-46fb-80c7-7a45da13e70a, please check neutron logs for more information.
[ 677.997423] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3]
[ 677.997423] env[61439]: INFO nova.compute.manager [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Terminating instance
[ 677.999605] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Acquiring lock "refresh_cache-daf5a018-8f3c-4aa3-bff6-d185c9f125b3" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 677.999605] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Acquired lock "refresh_cache-daf5a018-8f3c-4aa3-bff6-d185c9f125b3" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 677.999605] env[61439]: DEBUG nova.network.neutron [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 678.065631] env[61439]: DEBUG nova.network.neutron [-] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 678.075017] env[61439]: DEBUG nova.network.neutron [-] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 678.090860] env[61439]: INFO nova.compute.manager [-] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Took 0.33 seconds to deallocate network for instance.
[ 678.092043] env[61439]: DEBUG nova.compute.claims [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 678.095016] env[61439]: DEBUG oslo_concurrency.lockutils [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 678.095016] env[61439]: DEBUG oslo_concurrency.lockutils [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 678.104826] env[61439]: DEBUG nova.network.neutron [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 678.185611] env[61439]: DEBUG nova.network.neutron [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Successfully created port: b242e1a9-96ac-4229-abd4-34b88a2ae9b0 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 678.238231] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da43cd94-2d98-4360-a3fe-e27b41427b93 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 678.247168] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-883a4b1f-a6d4-4a7c-adcd-2081d90c16ad {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 678.285562] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cd44ccc-b1e8-4219-a947-e63d5b5deeef {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 678.294553] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df5f375f-1127-48f2-9812-e19b3a25ce12 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 678.313099] env[61439]: DEBUG nova.compute.provider_tree [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 678.329159] env[61439]: DEBUG nova.scheduler.client.report [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 678.348190] env[61439]: DEBUG oslo_concurrency.lockutils [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.255s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 678.349311] env[61439]: ERROR nova.compute.manager [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 8a86f436-b6f2-45a2-872f-ea805c774367, please check neutron logs for more information.
[ 678.349311] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Traceback (most recent call last):
[ 678.349311] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 678.349311] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] self.driver.spawn(context, instance, image_meta,
[ 678.349311] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 678.349311] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 678.349311] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 678.349311] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] vm_ref = self.build_virtual_machine(instance,
[ 678.349311] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 678.349311] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] vif_infos = vmwarevif.get_vif_info(self._session,
[ 678.349311] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 678.349688] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] for vif in network_info:
[ 678.349688] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 678.349688] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] return self._sync_wrapper(fn, *args, **kwargs)
[ 678.349688] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 678.349688] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] self.wait()
[ 678.349688] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 678.349688] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] self[:] = self._gt.wait()
[ 678.349688] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 678.349688] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] return self._exit_event.wait()
[ 678.349688] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 678.349688] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] result = hub.switch()
[ 678.349688] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 678.349688] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] return self.greenlet.switch()
[ 678.350391] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 678.350391] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] result = function(*args, **kwargs)
[ 678.350391] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 678.350391] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] return func(*args, **kwargs)
[ 678.350391] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 678.350391] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] raise e
[ 678.350391] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 678.350391] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] nwinfo = self.network_api.allocate_for_instance(
[ 678.350391] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 678.350391] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] created_port_ids = self._update_ports_for_instance(
[ 678.350391] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 678.350391] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] with excutils.save_and_reraise_exception():
[ 678.350391] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 678.350875] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] self.force_reraise()
[ 678.350875] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 678.350875] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] raise self.value
[ 678.350875] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 678.350875] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] updated_port = self._update_port(
[ 678.350875] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 678.350875] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] _ensure_no_port_binding_failure(port)
[ 678.350875] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 678.350875] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] raise exception.PortBindingFailed(port_id=port['id'])
[ 678.350875] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] nova.exception.PortBindingFailed: Binding failed for port 8a86f436-b6f2-45a2-872f-ea805c774367, please check neutron logs for more information.
[ 678.350875] env[61439]: ERROR nova.compute.manager [instance: e32ca59c-5bce-40e5-85e9-0979e673688a]
[ 678.351202] env[61439]: DEBUG nova.compute.utils [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Binding failed for port 8a86f436-b6f2-45a2-872f-ea805c774367, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 678.353082] env[61439]: DEBUG nova.compute.manager [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Build of instance e32ca59c-5bce-40e5-85e9-0979e673688a was re-scheduled: Binding failed for port 8a86f436-b6f2-45a2-872f-ea805c774367, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 678.353991] env[61439]: DEBUG nova.compute.manager [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 678.354387] env[61439]: DEBUG oslo_concurrency.lockutils [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Acquiring lock "refresh_cache-e32ca59c-5bce-40e5-85e9-0979e673688a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 678.354614] env[61439]: DEBUG oslo_concurrency.lockutils [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Acquired lock "refresh_cache-e32ca59c-5bce-40e5-85e9-0979e673688a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 678.354835] env[61439]: DEBUG nova.network.neutron [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Building network info cache for instance
{{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 678.496059] env[61439]: DEBUG nova.network.neutron [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 678.666418] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Acquiring lock "429ae3c8-1646-413e-a821-0d37a2ec20e4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 678.666655] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Lock "429ae3c8-1646-413e-a821-0d37a2ec20e4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 678.682946] env[61439]: DEBUG nova.compute.manager [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 678.748653] env[61439]: DEBUG nova.network.neutron [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 678.758590] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 678.758746] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 678.761267] env[61439]: INFO nova.compute.claims [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 678.773334] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Releasing lock "refresh_cache-daf5a018-8f3c-4aa3-bff6-d185c9f125b3" {{(pid=61439) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 678.773334] env[61439]: DEBUG nova.compute.manager [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 678.773334] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 678.773539] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-791020cb-65fe-4bde-b7fd-686666e8d80a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 678.786435] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2888ffa8-b663-4ff9-8a6a-11eb5e5b9952 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 678.806273] env[61439]: DEBUG nova.network.neutron [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Successfully created port: cbbf3dd4-54e9-4d63-b499-569d917f7cbf {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 678.814624] env[61439]: DEBUG nova.network.neutron [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: 
e32ca59c-5bce-40e5-85e9-0979e673688a] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 678.826571] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance daf5a018-8f3c-4aa3-bff6-d185c9f125b3 could not be found. [ 678.826898] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 678.827107] env[61439]: INFO nova.compute.manager [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Took 0.05 seconds to destroy the instance on the hypervisor. [ 678.827497] env[61439]: DEBUG oslo.service.loopingcall [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 678.828797] env[61439]: DEBUG oslo_concurrency.lockutils [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Releasing lock "refresh_cache-e32ca59c-5bce-40e5-85e9-0979e673688a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 678.829019] env[61439]: DEBUG nova.compute.manager [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 678.829202] env[61439]: DEBUG nova.compute.manager [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 678.829369] env[61439]: DEBUG nova.network.neutron [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 678.831349] env[61439]: DEBUG nova.compute.manager [-] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 678.831464] env[61439]: DEBUG nova.network.neutron [-] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] deallocate_for_instance() {{(pid=61439) deallocate_for_instance 
/opt/stack/nova/nova/network/neutron.py:1803}} [ 678.922690] env[61439]: DEBUG nova.network.neutron [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 678.937704] env[61439]: DEBUG nova.network.neutron [-] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 678.938944] env[61439]: DEBUG nova.network.neutron [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 678.948659] env[61439]: DEBUG nova.network.neutron [-] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 678.951297] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04974f1e-15b6-4386-a62e-474e225ba13b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 678.956621] env[61439]: INFO nova.compute.manager [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] [instance: e32ca59c-5bce-40e5-85e9-0979e673688a] Took 0.13 seconds to deallocate network for instance. 
[ 678.960710] env[61439]: INFO nova.compute.manager [-] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Took 0.13 seconds to deallocate network for instance. [ 678.962421] env[61439]: DEBUG nova.compute.claims [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 678.962597] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 678.965950] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ed40c95-e2c1-4aa9-ab1d-f67b1abe4040 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 679.013734] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d4a36ad-c5b9-45cd-9689-33bc1481e418 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 679.023933] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2d0c115-b60b-4399-9aca-27a4144bf685 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 679.042540] env[61439]: DEBUG nova.compute.provider_tree [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Inventory has not changed in ProviderTree for provider: 
b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 679.053208] env[61439]: DEBUG nova.scheduler.client.report [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 679.073653] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.315s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 679.074040] env[61439]: DEBUG nova.compute.manager [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Start building networks asynchronously for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 679.078545] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.114s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 679.100192] env[61439]: INFO nova.scheduler.client.report [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Deleted allocations for instance e32ca59c-5bce-40e5-85e9-0979e673688a [ 679.131400] env[61439]: DEBUG oslo_concurrency.lockutils [None req-468bca86-a3ef-4b10-89e0-41b51510fb04 tempest-TenantUsagesTestJSON-325677525 tempest-TenantUsagesTestJSON-325677525-project-member] Lock "e32ca59c-5bce-40e5-85e9-0979e673688a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.970s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 679.136705] env[61439]: DEBUG nova.compute.utils [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 679.139556] env[61439]: DEBUG nova.compute.manager [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 679.139743] env[61439]: DEBUG nova.network.neutron [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 679.158028] env[61439]: DEBUG nova.compute.manager [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 679.198691] env[61439]: INFO nova.virt.block_device [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Booting with volume 75a1df95-7638-4f43-a717-35b7645995ea at /dev/sda [ 679.237919] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f86e40c6-3301-47f2-aa60-38d62af8712d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 679.246809] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22bc455b-81f8-4b4b-898c-7af92e25a218 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 679.280550] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4030e251-59ab-439f-8465-551829256ddf {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 679.283635] env[61439]: DEBUG 
oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b4c0789f-d464-4aef-bd93-76b5413448c1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 679.293063] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef05d85e-b609-475c-8735-6235f36f773a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 679.299284] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1470fb6d-8bf3-4c85-9585-63fdb6168afa {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 679.321765] env[61439]: DEBUG nova.compute.provider_tree [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 679.327252] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c08c4129-932a-4dd0-9747-951846d3ceae {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 679.330721] env[61439]: DEBUG nova.scheduler.client.report [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 
'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 679.337490] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdb80556-25ed-4ff1-9a25-f6af05467930 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 679.350665] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.274s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 679.351306] env[61439]: ERROR nova.compute.manager [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port bae68af0-39c4-46fb-80c7-7a45da13e70a, please check neutron logs for more information. 
[ 679.351306] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Traceback (most recent call last): [ 679.351306] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 679.351306] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] self.driver.spawn(context, instance, image_meta, [ 679.351306] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 679.351306] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 679.351306] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 679.351306] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] vm_ref = self.build_virtual_machine(instance, [ 679.351306] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 679.351306] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] vif_infos = vmwarevif.get_vif_info(self._session, [ 679.351306] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 679.351825] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] for vif in network_info: [ 679.351825] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 679.351825] env[61439]: ERROR 
nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] return self._sync_wrapper(fn, *args, **kwargs) [ 679.351825] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 679.351825] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] self.wait() [ 679.351825] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 679.351825] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] self[:] = self._gt.wait() [ 679.351825] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 679.351825] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] return self._exit_event.wait() [ 679.351825] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 679.351825] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] result = hub.switch() [ 679.351825] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 679.351825] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] return self.greenlet.switch() [ 679.352261] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 679.352261] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] result 
= function(*args, **kwargs) [ 679.352261] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 679.352261] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] return func(*args, **kwargs) [ 679.352261] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 679.352261] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] raise e [ 679.352261] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 679.352261] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] nwinfo = self.network_api.allocate_for_instance( [ 679.352261] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 679.352261] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] created_port_ids = self._update_ports_for_instance( [ 679.352261] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 679.352261] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] with excutils.save_and_reraise_exception(): [ 679.352261] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 679.352670] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] self.force_reraise() [ 679.352670] env[61439]: 
ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 679.352670] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] raise self.value [ 679.352670] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 679.352670] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] updated_port = self._update_port( [ 679.352670] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 679.352670] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] _ensure_no_port_binding_failure(port) [ 679.352670] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 679.352670] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] raise exception.PortBindingFailed(port_id=port['id']) [ 679.352670] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] nova.exception.PortBindingFailed: Binding failed for port bae68af0-39c4-46fb-80c7-7a45da13e70a, please check neutron logs for more information. 
[ 679.352670] env[61439]: ERROR nova.compute.manager [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] [ 679.353017] env[61439]: DEBUG nova.compute.utils [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Binding failed for port bae68af0-39c4-46fb-80c7-7a45da13e70a, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 679.353659] env[61439]: DEBUG nova.compute.manager [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Build of instance daf5a018-8f3c-4aa3-bff6-d185c9f125b3 was re-scheduled: Binding failed for port bae68af0-39c4-46fb-80c7-7a45da13e70a, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 679.354152] env[61439]: DEBUG nova.compute.manager [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 679.354390] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Acquiring lock "refresh_cache-daf5a018-8f3c-4aa3-bff6-d185c9f125b3" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 679.354538] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 
tempest-ListServerFiltersTestJSON-1003122498-project-member] Acquired lock "refresh_cache-daf5a018-8f3c-4aa3-bff6-d185c9f125b3" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 679.354699] env[61439]: DEBUG nova.network.neutron [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 679.364599] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4333741-caa1-454b-b778-551ed965654e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 679.374982] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1105d54-7031-4daa-95f3-9fefc948c626 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 679.392675] env[61439]: DEBUG nova.virt.block_device [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Updating existing volume attachment record: 850cb627-f806-4c2c-b62d-33905c49bec2 {{(pid=61439) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} [ 679.432627] env[61439]: DEBUG nova.policy [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '37ebdb44c9774c7cb2451adfa198e770', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 
'942a20ed43a149499564de0ae5103aa5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 679.618199] env[61439]: DEBUG nova.network.neutron [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 679.654156] env[61439]: DEBUG nova.compute.manager [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 679.654689] env[61439]: DEBUG nova.virt.hardware [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format=,created_at=,direct_url=,disk_format=,id=,min_disk=0,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 679.654899] env[61439]: DEBUG nova.virt.hardware [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 679.655072] env[61439]: DEBUG nova.virt.hardware [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 679.655262] env[61439]: DEBUG nova.virt.hardware [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 679.655407] env[61439]: DEBUG nova.virt.hardware [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 
tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 679.655551] env[61439]: DEBUG nova.virt.hardware [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 679.657176] env[61439]: DEBUG nova.virt.hardware [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 679.658088] env[61439]: DEBUG nova.virt.hardware [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 679.659676] env[61439]: DEBUG nova.virt.hardware [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 679.659676] env[61439]: DEBUG nova.virt.hardware [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:575}} [ 679.659676] env[61439]: DEBUG nova.virt.hardware [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 679.663343] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d2f96fd-d96b-4c1a-a1c4-4fa07d6b6ea3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 679.670393] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75c840c4-bf94-4d2c-9630-0f98e5bab1b0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 680.147113] env[61439]: DEBUG nova.network.neutron [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 680.164143] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Releasing lock "refresh_cache-daf5a018-8f3c-4aa3-bff6-d185c9f125b3" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 680.164143] env[61439]: DEBUG nova.compute.manager [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Virt driver does not provide unplug_vifs method, so it 
is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 680.164143] env[61439]: DEBUG nova.compute.manager [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 680.164143] env[61439]: DEBUG nova.network.neutron [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 680.270220] env[61439]: DEBUG nova.network.neutron [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 680.285014] env[61439]: DEBUG nova.network.neutron [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 680.298104] env[61439]: INFO nova.compute.manager [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] [instance: daf5a018-8f3c-4aa3-bff6-d185c9f125b3] Took 0.13 seconds to deallocate network for instance. 
[ 680.454531] env[61439]: INFO nova.scheduler.client.report [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Deleted allocations for instance daf5a018-8f3c-4aa3-bff6-d185c9f125b3 [ 680.484437] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0fdc3667-2428-4992-9035-a94f07f007eb tempest-ListServerFiltersTestJSON-1003122498 tempest-ListServerFiltersTestJSON-1003122498-project-member] Lock "daf5a018-8f3c-4aa3-bff6-d185c9f125b3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 9.744s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 681.169567] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Acquiring lock "0e994334-34bc-4c80-8825-8cfec714f81f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 681.169903] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Lock "0e994334-34bc-4c80-8825-8cfec714f81f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 681.184410] env[61439]: DEBUG nova.compute.manager [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 681.278187] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 681.278187] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 681.280033] env[61439]: INFO nova.compute.claims [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 681.507380] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68a30ea9-c30a-4cb9-a77c-a192e9b09da8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 681.520827] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-281dd687-2db0-4ba9-bd95-b85cb685a55d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 681.569643] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7070a307-88d8-4b28-b846-51d707cfaf3e {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 681.579750] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d06ec01-2768-4454-8490-7b385977c105 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 681.595912] env[61439]: DEBUG nova.compute.provider_tree [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 681.620182] env[61439]: DEBUG nova.scheduler.client.report [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 681.649641] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.368s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 681.649641] env[61439]: DEBUG nova.compute.manager [None req-c32066ce-073f-4505-a972-1a626174f3c3 
tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 681.712878] env[61439]: DEBUG nova.compute.utils [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 681.714286] env[61439]: DEBUG nova.compute.manager [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 681.716056] env[61439]: DEBUG nova.network.neutron [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 681.739806] env[61439]: DEBUG nova.compute.manager [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 681.868752] env[61439]: DEBUG nova.compute.manager [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 681.900798] env[61439]: DEBUG nova.virt.hardware [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:05:15Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3c1e456d-60ee-4f5e-912d-5548c2f9847f',id=24,is_public=True,memory_mb=128,name='tempest-test_resize_flavor_-1422984662',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 681.901055] env[61439]: DEBUG nova.virt.hardware [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 681.901216] env[61439]: DEBUG nova.virt.hardware [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 681.901401] env[61439]: DEBUG nova.virt.hardware [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 
tempest-MigrationsAdminTest-1447096877-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 681.902488] env[61439]: DEBUG nova.virt.hardware [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 681.902488] env[61439]: DEBUG nova.virt.hardware [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 681.902488] env[61439]: DEBUG nova.virt.hardware [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 681.903283] env[61439]: DEBUG nova.virt.hardware [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 681.903486] env[61439]: DEBUG nova.virt.hardware [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 681.903714] env[61439]: DEBUG nova.virt.hardware [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 
tempest-MigrationsAdminTest-1447096877-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 681.903996] env[61439]: DEBUG nova.virt.hardware [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 681.905397] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51f11b7c-ad7a-46dc-bc72-9e037cf86795 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 681.916603] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b915874b-a1da-40db-800d-4dccc85d36eb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 682.058870] env[61439]: DEBUG nova.network.neutron [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Successfully created port: 0d60095c-bda7-480f-944a-27d6f3cfd665 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 682.374881] env[61439]: DEBUG nova.policy [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bfae70c3a6a94df4a6689ac18626fe37', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '870604e0902d43528453a2950d1080d3', 'project_domain_id': 'default', 'roles': 
['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 684.093456] env[61439]: DEBUG nova.network.neutron [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Successfully created port: 3e6a70c5-5d66-4536-bff8-68f7bc1921c8 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 685.235975] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Acquiring lock "a41cb33f-8340-4b15-b19d-0a7b9396eae7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 685.235975] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Lock "a41cb33f-8340-4b15-b19d-0a7b9396eae7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 685.256521] env[61439]: DEBUG nova.compute.manager [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 685.356683] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 685.356985] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 685.358540] env[61439]: INFO nova.compute.claims [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 685.566587] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad056d31-2652-498e-b18f-c476bdf83be3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 685.574731] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f712f9a3-2878-4b75-80b0-3bddcee64cb1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 685.608568] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1f7245f-0010-49af-916e-eafa566e5dc0 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 685.634467] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44696168-c7b5-4df4-9be1-7c9db0a2359a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 685.655175] env[61439]: DEBUG nova.compute.provider_tree [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 685.668084] env[61439]: DEBUG nova.scheduler.client.report [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 685.690228] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.332s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 685.690228] env[61439]: DEBUG nova.compute.manager [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e 
tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 685.720310] env[61439]: ERROR nova.compute.manager [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port cbbf3dd4-54e9-4d63-b499-569d917f7cbf, please check neutron logs for more information. [ 685.720310] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 685.720310] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 685.720310] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 685.720310] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 685.720310] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 685.720310] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 685.720310] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 685.720310] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 685.720310] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 685.720310] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 685.720310] env[61439]: ERROR nova.compute.manager raise self.value [ 685.720310] env[61439]: ERROR nova.compute.manager 
File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 685.720310] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 685.720310] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 685.720310] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 685.721232] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 685.721232] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 685.721232] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port cbbf3dd4-54e9-4d63-b499-569d917f7cbf, please check neutron logs for more information. [ 685.721232] env[61439]: ERROR nova.compute.manager [ 685.721232] env[61439]: Traceback (most recent call last): [ 685.721232] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 685.721232] env[61439]: listener.cb(fileno) [ 685.721232] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 685.721232] env[61439]: result = function(*args, **kwargs) [ 685.721232] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 685.721232] env[61439]: return func(*args, **kwargs) [ 685.721232] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 685.721232] env[61439]: raise e [ 685.721232] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 685.721232] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 685.721232] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 685.721232] env[61439]: created_port_ids = self._update_ports_for_instance( [ 685.721232] 
env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 685.721232] env[61439]: with excutils.save_and_reraise_exception(): [ 685.721232] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 685.721232] env[61439]: self.force_reraise() [ 685.721232] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 685.721232] env[61439]: raise self.value [ 685.721232] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 685.721232] env[61439]: updated_port = self._update_port( [ 685.721232] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 685.721232] env[61439]: _ensure_no_port_binding_failure(port) [ 685.721232] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 685.721232] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 685.722594] env[61439]: nova.exception.PortBindingFailed: Binding failed for port cbbf3dd4-54e9-4d63-b499-569d917f7cbf, please check neutron logs for more information. [ 685.722594] env[61439]: Removing descriptor: 10 [ 685.722594] env[61439]: ERROR nova.compute.manager [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port cbbf3dd4-54e9-4d63-b499-569d917f7cbf, please check neutron logs for more information. 
[ 685.722594] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] Traceback (most recent call last): [ 685.722594] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 685.722594] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] yield resources [ 685.722594] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 685.722594] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] self.driver.spawn(context, instance, image_meta, [ 685.722594] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 685.722594] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] self._vmops.spawn(context, instance, image_meta, injected_files, [ 685.722594] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 685.722594] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] vm_ref = self.build_virtual_machine(instance, [ 685.723152] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 685.723152] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] vif_infos = vmwarevif.get_vif_info(self._session, [ 685.723152] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 685.723152] env[61439]: ERROR 
nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] for vif in network_info: [ 685.723152] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 685.723152] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] return self._sync_wrapper(fn, *args, **kwargs) [ 685.723152] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 685.723152] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] self.wait() [ 685.723152] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 685.723152] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] self[:] = self._gt.wait() [ 685.723152] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 685.723152] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] return self._exit_event.wait() [ 685.723152] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 685.723710] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] result = hub.switch() [ 685.723710] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 685.723710] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] return self.greenlet.switch() [ 685.723710] 
env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 685.723710] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] result = function(*args, **kwargs) [ 685.723710] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 685.723710] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] return func(*args, **kwargs) [ 685.723710] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 685.723710] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] raise e [ 685.723710] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 685.723710] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] nwinfo = self.network_api.allocate_for_instance( [ 685.723710] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 685.723710] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] created_port_ids = self._update_ports_for_instance( [ 685.724578] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 685.724578] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] with excutils.save_and_reraise_exception(): [ 685.724578] env[61439]: ERROR nova.compute.manager 
[instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 685.724578] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] self.force_reraise() [ 685.724578] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 685.724578] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] raise self.value [ 685.724578] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 685.724578] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] updated_port = self._update_port( [ 685.724578] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 685.724578] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] _ensure_no_port_binding_failure(port) [ 685.724578] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 685.724578] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] raise exception.PortBindingFailed(port_id=port['id']) [ 685.724917] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] nova.exception.PortBindingFailed: Binding failed for port cbbf3dd4-54e9-4d63-b499-569d917f7cbf, please check neutron logs for more information. 
[ 685.724917] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] [ 685.724917] env[61439]: INFO nova.compute.manager [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Terminating instance [ 685.724917] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Acquiring lock "refresh_cache-f66615e9-2e8c-43d9-be73-154398e26934" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 685.724917] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Acquired lock "refresh_cache-f66615e9-2e8c-43d9-be73-154398e26934" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 685.724917] env[61439]: DEBUG nova.network.neutron [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 685.742089] env[61439]: DEBUG nova.compute.utils [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 685.743513] env[61439]: DEBUG nova.compute.manager [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 
tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Not allocating networking since 'none' was specified. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 685.755895] env[61439]: DEBUG nova.compute.manager [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 685.819396] env[61439]: DEBUG nova.network.neutron [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 685.848049] env[61439]: DEBUG nova.compute.manager [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 685.884099] env[61439]: DEBUG nova.virt.hardware [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 685.884318] env[61439]: DEBUG nova.virt.hardware [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 685.884502] env[61439]: DEBUG nova.virt.hardware [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 685.884717] env[61439]: DEBUG nova.virt.hardware [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Flavor pref 0:0:0 {{(pid=61439) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 685.885020] env[61439]: DEBUG nova.virt.hardware [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 685.885213] env[61439]: DEBUG nova.virt.hardware [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 685.885465] env[61439]: DEBUG nova.virt.hardware [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 685.885650] env[61439]: DEBUG nova.virt.hardware [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 685.885845] env[61439]: DEBUG nova.virt.hardware [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 685.886043] env[61439]: DEBUG nova.virt.hardware [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 685.886223] env[61439]: DEBUG nova.virt.hardware [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 685.887457] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c32b5858-569b-4b37-b67c-013d3b8f4a68 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 685.897151] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9cf10157-bbf8-4d0f-878f-4dacaed9bd4b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 685.915122] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Instance VIF info [] {{(pid=61439) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 685.922143] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Creating folder: Project (478c83960c4a4bf1b9e93a534ea7528e). Parent ref: group-v221281. 
{{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 685.922900] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-59695ee1-5144-401a-9ab9-3696f1235ba6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 685.936147] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Created folder: Project (478c83960c4a4bf1b9e93a534ea7528e) in parent group-v221281. [ 685.936365] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Creating folder: Instances. Parent ref: group-v221289. {{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 685.937191] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e404a215-f50f-43f5-abef-13ffe2888e53 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 685.950868] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Created folder: Instances in parent group-v221289. [ 685.950970] env[61439]: DEBUG oslo.service.loopingcall [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 685.951203] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Creating VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 685.951388] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7949f717-d063-4da6-a014-d5ef073f3e39 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 685.975019] env[61439]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 685.975019] env[61439]: value = "task-987654" [ 685.975019] env[61439]: _type = "Task" [ 685.975019] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 685.985657] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987654, 'name': CreateVM_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 686.343034] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Acquiring lock "8f553865-312c-4b6a-9383-43f2cbcd0b5e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 686.343541] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Lock "8f553865-312c-4b6a-9383-43f2cbcd0b5e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 686.360980] env[61439]: DEBUG nova.compute.manager [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 686.447238] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 686.447340] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 686.450813] env[61439]: INFO nova.compute.claims [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 686.490666] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987654, 'name': CreateVM_Task, 'duration_secs': 0.297433} completed successfully. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 686.491086] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Created VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 686.491571] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 686.491903] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 686.492322] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 686.493977] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-157c6599-9faa-4aa6-99f8-99501c1c9844 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 686.500595] env[61439]: DEBUG oslo_vmware.api [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Waiting for the 
task: (returnval){ [ 686.500595] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52e8d3ce-f40d-ae5a-bf07-308234ab3bcd" [ 686.500595] env[61439]: _type = "Task" [ 686.500595] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 686.516731] env[61439]: DEBUG oslo_vmware.api [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52e8d3ce-f40d-ae5a-bf07-308234ab3bcd, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 686.647241] env[61439]: DEBUG nova.network.neutron [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 686.671160] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Releasing lock "refresh_cache-f66615e9-2e8c-43d9-be73-154398e26934" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 686.671601] env[61439]: DEBUG nova.compute.manager [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 686.671785] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 686.673569] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b0eec2cc-43fb-4aa0-b22e-14caa3ea41e4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 686.688653] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3ca8c8a-30e4-4165-b3e8-2394b7831523 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 686.709680] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f26b6d9-ac0f-4f68-bca9-df676caffde5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 686.720777] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fee3de5-1717-40b5-a0b4-c1222b47cf99 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 686.731074] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f66615e9-2e8c-43d9-be73-154398e26934 could not be found. 
[ 686.731825] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 686.732098] env[61439]: INFO nova.compute.manager [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Took 0.06 seconds to destroy the instance on the hypervisor.
[ 686.732332] env[61439]: DEBUG oslo.service.loopingcall [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 686.732681] env[61439]: DEBUG nova.compute.manager [-] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 686.732681] env[61439]: DEBUG nova.network.neutron [-] [instance: f66615e9-2e8c-43d9-be73-154398e26934] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 686.760642] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72404cf6-92c8-470d-b2e3-c9b547086b7e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 686.770056] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2dba9679-e31a-49ca-9fd5-9d27bbe57ad4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 686.788081] env[61439]: DEBUG nova.compute.provider_tree [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 686.799212] env[61439]: DEBUG nova.scheduler.client.report [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 686.819467] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.372s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 686.819738] env[61439]: DEBUG nova.compute.manager [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 686.843632] env[61439]: DEBUG nova.network.neutron [-] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 686.858034] env[61439]: DEBUG nova.network.neutron [-] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 686.879287] env[61439]: INFO nova.compute.manager [-] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Took 0.14 seconds to deallocate network for instance.
[ 686.879287] env[61439]: DEBUG nova.compute.claims [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 686.879287] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 686.879287] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 686.892181] env[61439]: ERROR nova.compute.manager [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port b242e1a9-96ac-4229-abd4-34b88a2ae9b0, please check neutron logs for more information.
[ 686.892181] env[61439]: ERROR nova.compute.manager Traceback (most recent call last):
[ 686.892181] env[61439]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 686.892181] env[61439]: ERROR nova.compute.manager     nwinfo = self.network_api.allocate_for_instance(
[ 686.892181] env[61439]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 686.892181] env[61439]: ERROR nova.compute.manager     created_port_ids = self._update_ports_for_instance(
[ 686.892181] env[61439]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 686.892181] env[61439]: ERROR nova.compute.manager     with excutils.save_and_reraise_exception():
[ 686.892181] env[61439]: ERROR nova.compute.manager   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 686.892181] env[61439]: ERROR nova.compute.manager     self.force_reraise()
[ 686.892181] env[61439]: ERROR nova.compute.manager   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 686.892181] env[61439]: ERROR nova.compute.manager     raise self.value
[ 686.892181] env[61439]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 686.892181] env[61439]: ERROR nova.compute.manager     updated_port = self._update_port(
[ 686.892181] env[61439]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 686.892181] env[61439]: ERROR nova.compute.manager     _ensure_no_port_binding_failure(port)
[ 686.892711] env[61439]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 686.892711] env[61439]: ERROR nova.compute.manager     raise exception.PortBindingFailed(port_id=port['id'])
[ 686.892711] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port b242e1a9-96ac-4229-abd4-34b88a2ae9b0, please check neutron logs for more information.
[ 686.892711] env[61439]: ERROR nova.compute.manager
[ 686.892711] env[61439]: Traceback (most recent call last):
[ 686.892711] env[61439]:   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait
[ 686.892711] env[61439]:     listener.cb(fileno)
[ 686.892711] env[61439]:   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 686.892711] env[61439]:     result = function(*args, **kwargs)
[ 686.892711] env[61439]:   File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 686.892711] env[61439]:     return func(*args, **kwargs)
[ 686.892711] env[61439]:   File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 686.892711] env[61439]:     raise e
[ 686.892711] env[61439]:   File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 686.892711] env[61439]:     nwinfo = self.network_api.allocate_for_instance(
[ 686.892711] env[61439]:   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 686.892711] env[61439]:     created_port_ids = self._update_ports_for_instance(
[ 686.892711] env[61439]:   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 686.892711] env[61439]:     with excutils.save_and_reraise_exception():
[ 686.892711] env[61439]:   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 686.892711] env[61439]:     self.force_reraise()
[ 686.892711] env[61439]:   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 686.892711] env[61439]:     raise self.value
[ 686.892711] env[61439]:   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 686.892711] env[61439]:     updated_port = self._update_port(
[ 686.892711] env[61439]:   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 686.892711] env[61439]:     _ensure_no_port_binding_failure(port)
[ 686.892711] env[61439]:   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 686.892711] env[61439]:     raise exception.PortBindingFailed(port_id=port['id'])
[ 686.893506] env[61439]: nova.exception.PortBindingFailed: Binding failed for port b242e1a9-96ac-4229-abd4-34b88a2ae9b0, please check neutron logs for more information.
[ 686.893506] env[61439]: Removing descriptor: 24
[ 686.893811] env[61439]: DEBUG nova.compute.utils [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 686.896340] env[61439]: ERROR nova.compute.manager [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port b242e1a9-96ac-4229-abd4-34b88a2ae9b0, please check neutron logs for more information.
[ 686.896340] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Traceback (most recent call last):
[ 686.896340] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]   File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 686.896340] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]     yield resources
[ 686.896340] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 686.896340] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]     self.driver.spawn(context, instance, image_meta,
[ 686.896340] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 686.896340] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 686.896340] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 686.896340] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]     vm_ref = self.build_virtual_machine(instance,
[ 686.896340] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 686.896729] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 686.896729] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 686.896729] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]     for vif in network_info:
[ 686.896729] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 686.896729] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]     return self._sync_wrapper(fn, *args, **kwargs)
[ 686.896729] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 686.896729] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]     self.wait()
[ 686.896729] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 686.896729] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]     self[:] = self._gt.wait()
[ 686.896729] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 686.896729] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]     return self._exit_event.wait()
[ 686.896729] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 686.896729] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]     result = hub.switch()
[ 686.897144] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 686.897144] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]     return self.greenlet.switch()
[ 686.897144] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 686.897144] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]     result = function(*args, **kwargs)
[ 686.897144] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]   File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 686.897144] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]     return func(*args, **kwargs)
[ 686.897144] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]   File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 686.897144] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]     raise e
[ 686.897144] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]   File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 686.897144] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]     nwinfo = self.network_api.allocate_for_instance(
[ 686.897144] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 686.897144] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]     created_port_ids = self._update_ports_for_instance(
[ 686.897144] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 686.897559] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]     with excutils.save_and_reraise_exception():
[ 686.897559] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 686.897559] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]     self.force_reraise()
[ 686.897559] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 686.897559] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]     raise self.value
[ 686.897559] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 686.897559] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]     updated_port = self._update_port(
[ 686.897559] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 686.897559] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]     _ensure_no_port_binding_failure(port)
[ 686.897559] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 686.897559] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]     raise exception.PortBindingFailed(port_id=port['id'])
[ 686.897559] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] nova.exception.PortBindingFailed: Binding failed for port b242e1a9-96ac-4229-abd4-34b88a2ae9b0, please check neutron logs for more information.
[ 686.897559] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36]
[ 686.898076] env[61439]: INFO nova.compute.manager [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Terminating instance
[ 686.898076] env[61439]: DEBUG nova.compute.manager [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 686.903168] env[61439]: DEBUG nova.network.neutron [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 686.906125] env[61439]: DEBUG oslo_concurrency.lockutils [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Acquiring lock "refresh_cache-4998340d-3afc-4fc7-bc2a-57913b534d36" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 686.906603] env[61439]: DEBUG oslo_concurrency.lockutils [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Acquired lock "refresh_cache-4998340d-3afc-4fc7-bc2a-57913b534d36" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 686.906603] env[61439]: DEBUG nova.network.neutron [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 686.918026] env[61439]: DEBUG nova.compute.manager [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 687.025588] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 687.026091] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Processing image a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 687.026672] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 687.049231] env[61439]: DEBUG nova.network.neutron [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 687.065221] env[61439]: DEBUG nova.compute.manager [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 687.112183] env[61439]: DEBUG nova.virt.hardware [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:05:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='917952258',id=23,is_public=True,memory_mb=128,name='tempest-flavor_with_ephemeral_0-621946460',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 687.112616] env[61439]: DEBUG nova.virt.hardware [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 687.112616] env[61439]: DEBUG nova.virt.hardware [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 687.112743] env[61439]: DEBUG nova.virt.hardware [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 687.112826] env[61439]: DEBUG nova.virt.hardware [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 687.112971] env[61439]: DEBUG nova.virt.hardware [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 687.113186] env[61439]: DEBUG nova.virt.hardware [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 687.113381] env[61439]: DEBUG nova.virt.hardware [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 687.114037] env[61439]: DEBUG nova.virt.hardware [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 687.114037] env[61439]: DEBUG nova.virt.hardware [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 687.114037] env[61439]: DEBUG nova.virt.hardware [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 687.115522] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-660d599e-cf27-4545-b4fa-ef4afae5eb79 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 687.127822] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-749b7a8e-9c14-487c-b703-871a1d578298 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 687.178241] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-157cd305-e63e-4af0-a37f-b9ef45c1ed90 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 687.186895] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8cec7b0a-cac1-4354-8931-3c82ebdfab47 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 687.221556] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8fcca6b-5e06-4367-a5a6-1a05567e556f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 687.230470] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e9d970d-07bc-4653-9828-eac0b6fd6a6f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 687.245911] env[61439]: DEBUG nova.compute.provider_tree [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 687.259519] env[61439]: DEBUG nova.scheduler.client.report [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 687.278108] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.399s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 687.278291] env[61439]: ERROR nova.compute.manager [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port cbbf3dd4-54e9-4d63-b499-569d917f7cbf, please check neutron logs for more information.
[ 687.278291] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] Traceback (most recent call last): [ 687.278291] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 687.278291] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] self.driver.spawn(context, instance, image_meta, [ 687.278291] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 687.278291] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] self._vmops.spawn(context, instance, image_meta, injected_files, [ 687.278291] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 687.278291] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] vm_ref = self.build_virtual_machine(instance, [ 687.278291] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 687.278291] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] vif_infos = vmwarevif.get_vif_info(self._session, [ 687.278291] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 687.278660] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] for vif in network_info: [ 687.278660] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 687.278660] env[61439]: ERROR 
nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] return self._sync_wrapper(fn, *args, **kwargs) [ 687.278660] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 687.278660] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] self.wait() [ 687.278660] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 687.278660] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] self[:] = self._gt.wait() [ 687.278660] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 687.278660] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] return self._exit_event.wait() [ 687.278660] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 687.278660] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] result = hub.switch() [ 687.278660] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 687.278660] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] return self.greenlet.switch() [ 687.279082] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 687.279082] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] result 
= function(*args, **kwargs) [ 687.279082] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 687.279082] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] return func(*args, **kwargs) [ 687.279082] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 687.279082] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] raise e [ 687.279082] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 687.279082] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] nwinfo = self.network_api.allocate_for_instance( [ 687.279082] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 687.279082] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] created_port_ids = self._update_ports_for_instance( [ 687.279082] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 687.279082] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] with excutils.save_and_reraise_exception(): [ 687.279082] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 687.279483] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] self.force_reraise() [ 687.279483] env[61439]: 
ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 687.279483] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] raise self.value [ 687.279483] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 687.279483] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] updated_port = self._update_port( [ 687.279483] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 687.279483] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] _ensure_no_port_binding_failure(port) [ 687.279483] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 687.279483] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] raise exception.PortBindingFailed(port_id=port['id']) [ 687.279483] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] nova.exception.PortBindingFailed: Binding failed for port cbbf3dd4-54e9-4d63-b499-569d917f7cbf, please check neutron logs for more information. 
[ 687.279483] env[61439]: ERROR nova.compute.manager [instance: f66615e9-2e8c-43d9-be73-154398e26934] [ 687.279833] env[61439]: DEBUG nova.compute.utils [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Binding failed for port cbbf3dd4-54e9-4d63-b499-569d917f7cbf, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 687.283044] env[61439]: DEBUG nova.compute.manager [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Build of instance f66615e9-2e8c-43d9-be73-154398e26934 was re-scheduled: Binding failed for port cbbf3dd4-54e9-4d63-b499-569d917f7cbf, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 687.283044] env[61439]: DEBUG nova.compute.manager [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 687.283044] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Acquiring lock "refresh_cache-f66615e9-2e8c-43d9-be73-154398e26934" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 687.283044] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 
tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Acquired lock "refresh_cache-f66615e9-2e8c-43d9-be73-154398e26934" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 687.283443] env[61439]: DEBUG nova.network.neutron [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 687.302296] env[61439]: DEBUG nova.policy [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4cedace28efb449abe5bc60d4f6b3b7c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fc06e5943835421bb7461a395d058b90', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 687.386471] env[61439]: DEBUG nova.network.neutron [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 687.567725] env[61439]: DEBUG nova.network.neutron [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 687.593213] env[61439]: DEBUG oslo_concurrency.lockutils [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Releasing lock "refresh_cache-4998340d-3afc-4fc7-bc2a-57913b534d36" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 687.593885] env[61439]: DEBUG nova.compute.manager [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 687.594392] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 687.596087] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a15fa21a-cfef-45db-a039-a038dcaf545c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 687.611371] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18217360-10c2-450c-b86c-ea32e39f1956 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 687.647699] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4998340d-3afc-4fc7-bc2a-57913b534d36 could not be found. 
[ 687.648468] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 687.648468] env[61439]: INFO nova.compute.manager [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Took 0.05 seconds to destroy the instance on the hypervisor. [ 687.648657] env[61439]: DEBUG oslo.service.loopingcall [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 687.648876] env[61439]: DEBUG nova.compute.manager [-] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 687.648973] env[61439]: DEBUG nova.network.neutron [-] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 687.733435] env[61439]: DEBUG nova.network.neutron [-] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 687.742170] env[61439]: DEBUG nova.network.neutron [-] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 687.761400] env[61439]: INFO nova.compute.manager [-] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Took 0.11 seconds to deallocate network for instance. [ 687.763189] env[61439]: DEBUG nova.compute.claims [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 687.764020] env[61439]: DEBUG oslo_concurrency.lockutils [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 687.764020] env[61439]: DEBUG oslo_concurrency.lockutils [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 688.047807] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ffd53a3-5dc7-4121-a74d-6222ee186541 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 688.061015] env[61439]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8d14641-a4e4-4310-a937-cc080c55e779 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 688.094330] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-339e1308-48be-4ae3-ba87-da8cbff7f3e1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 688.103903] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-341cf78c-4f98-413f-bdb5-617875fb8bf4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 688.120325] env[61439]: DEBUG nova.compute.provider_tree [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 688.135136] env[61439]: DEBUG nova.scheduler.client.report [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 688.157785] env[61439]: DEBUG oslo_concurrency.lockutils [None 
req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.394s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 688.158546] env[61439]: ERROR nova.compute.manager [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port b242e1a9-96ac-4229-abd4-34b88a2ae9b0, please check neutron logs for more information. [ 688.158546] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Traceback (most recent call last): [ 688.158546] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 688.158546] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] self.driver.spawn(context, instance, image_meta, [ 688.158546] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 688.158546] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] self._vmops.spawn(context, instance, image_meta, injected_files, [ 688.158546] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 688.158546] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] vm_ref = self.build_virtual_machine(instance, [ 688.158546] env[61439]: ERROR nova.compute.manager 
[instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 688.158546] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] vif_infos = vmwarevif.get_vif_info(self._session, [ 688.158546] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 688.158957] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] for vif in network_info: [ 688.158957] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 688.158957] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] return self._sync_wrapper(fn, *args, **kwargs) [ 688.158957] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 688.158957] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] self.wait() [ 688.158957] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 688.158957] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] self[:] = self._gt.wait() [ 688.158957] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 688.158957] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] return self._exit_event.wait() [ 688.158957] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 688.158957] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] result = hub.switch() [ 688.158957] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 688.158957] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] return self.greenlet.switch() [ 688.159321] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 688.159321] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] result = function(*args, **kwargs) [ 688.159321] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 688.159321] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] return func(*args, **kwargs) [ 688.159321] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 688.159321] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] raise e [ 688.159321] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 688.159321] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] nwinfo = self.network_api.allocate_for_instance( [ 688.159321] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in 
allocate_for_instance [ 688.159321] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] created_port_ids = self._update_ports_for_instance( [ 688.159321] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 688.159321] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] with excutils.save_and_reraise_exception(): [ 688.159321] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 688.159730] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] self.force_reraise() [ 688.159730] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 688.159730] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] raise self.value [ 688.159730] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 688.159730] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] updated_port = self._update_port( [ 688.159730] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 688.159730] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] _ensure_no_port_binding_failure(port) [ 688.159730] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] File "/opt/stack/nova/nova/network/neutron.py", line 294, in 
_ensure_no_port_binding_failure [ 688.159730] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] raise exception.PortBindingFailed(port_id=port['id']) [ 688.159730] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] nova.exception.PortBindingFailed: Binding failed for port b242e1a9-96ac-4229-abd4-34b88a2ae9b0, please check neutron logs for more information. [ 688.159730] env[61439]: ERROR nova.compute.manager [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] [ 688.160047] env[61439]: DEBUG nova.compute.utils [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Binding failed for port b242e1a9-96ac-4229-abd4-34b88a2ae9b0, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 688.161719] env[61439]: DEBUG nova.compute.manager [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Build of instance 4998340d-3afc-4fc7-bc2a-57913b534d36 was re-scheduled: Binding failed for port b242e1a9-96ac-4229-abd4-34b88a2ae9b0, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 688.162167] env[61439]: DEBUG nova.compute.manager [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 688.162507] env[61439]: DEBUG oslo_concurrency.lockutils [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Acquiring lock "refresh_cache-4998340d-3afc-4fc7-bc2a-57913b534d36" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 688.163054] env[61439]: DEBUG oslo_concurrency.lockutils [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Acquired lock "refresh_cache-4998340d-3afc-4fc7-bc2a-57913b534d36" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 688.163054] env[61439]: DEBUG nova.network.neutron [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 688.283574] env[61439]: DEBUG nova.network.neutron [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 
688.295144] env[61439]: DEBUG nova.network.neutron [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 688.300364] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Releasing lock "refresh_cache-f66615e9-2e8c-43d9-be73-154398e26934" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 688.300512] env[61439]: DEBUG nova.compute.manager [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 688.300652] env[61439]: DEBUG nova.compute.manager [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 688.300813] env[61439]: DEBUG nova.network.neutron [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 688.384311] env[61439]: DEBUG nova.network.neutron [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 688.399262] env[61439]: DEBUG nova.network.neutron [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 688.412592] env[61439]: INFO nova.compute.manager [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] [instance: f66615e9-2e8c-43d9-be73-154398e26934] Took 0.11 seconds to deallocate network for instance. 
[ 688.535771] env[61439]: WARNING oslo_vmware.rw_handles [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 688.535771] env[61439]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 688.535771] env[61439]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 688.535771] env[61439]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 688.535771] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 688.535771] env[61439]: ERROR oslo_vmware.rw_handles response.begin() [ 688.535771] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 688.535771] env[61439]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 688.535771] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 688.535771] env[61439]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 688.535771] env[61439]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 688.535771] env[61439]: ERROR oslo_vmware.rw_handles [ 688.536249] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Downloaded image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to vmware_temp/d1d03143-68bf-4adf-89ed-e27f21cbd362/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) fetch_image 
/opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 688.538743] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Caching image {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 688.539046] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Copying Virtual Disk [datastore2] vmware_temp/d1d03143-68bf-4adf-89ed-e27f21cbd362/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk to [datastore2] vmware_temp/d1d03143-68bf-4adf-89ed-e27f21cbd362/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk {{(pid=61439) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 688.539613] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-b4280f3a-8bf2-4e79-b5cc-7b5b8adfb4ed {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 688.545935] env[61439]: INFO nova.scheduler.client.report [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Deleted allocations for instance f66615e9-2e8c-43d9-be73-154398e26934 [ 688.556019] env[61439]: DEBUG oslo_vmware.api [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Waiting for the task: (returnval){ [ 688.556019] env[61439]: value = "task-987655" [ 688.556019] env[61439]: _type = "Task" [ 688.556019] env[61439]: } to complete. 
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 688.566311] env[61439]: DEBUG oslo_vmware.api [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Task: {'id': task-987655, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 688.571396] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f72ed652-bc39-4027-8d10-f95153c80ab0 tempest-VolumesAssistedSnapshotsTest-593068978 tempest-VolumesAssistedSnapshotsTest-593068978-project-member] Lock "f66615e9-2e8c-43d9-be73-154398e26934" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.322s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 689.069522] env[61439]: DEBUG oslo_vmware.exceptions [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Fault InvalidArgument not matched. 
{{(pid=61439) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 689.069522] env[61439]: DEBUG oslo_concurrency.lockutils [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 689.069967] env[61439]: ERROR nova.compute.manager [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 689.069967] env[61439]: Faults: ['InvalidArgument'] [ 689.069967] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Traceback (most recent call last): [ 689.069967] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 689.069967] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] yield resources [ 689.069967] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 689.069967] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] self.driver.spawn(context, instance, image_meta, [ 689.069967] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 689.069967] env[61439]: ERROR nova.compute.manager [instance: 
2afc8edb-3331-476a-bda3-4f8071461084] self._vmops.spawn(context, instance, image_meta, injected_files, [ 689.069967] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 689.069967] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] self._fetch_image_if_missing(context, vi) [ 689.069967] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 689.070440] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] image_cache(vi, tmp_image_ds_loc) [ 689.070440] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 689.070440] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] vm_util.copy_virtual_disk( [ 689.070440] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 689.070440] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] session._wait_for_task(vmdk_copy_task) [ 689.070440] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 689.070440] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] return self.wait_for_task(task_ref) [ 689.070440] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 689.070440] env[61439]: ERROR nova.compute.manager [instance: 
2afc8edb-3331-476a-bda3-4f8071461084] return evt.wait() [ 689.070440] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 689.070440] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] result = hub.switch() [ 689.070440] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 689.070440] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] return self.greenlet.switch() [ 689.073836] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 689.073836] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] self.f(*self.args, **self.kw) [ 689.073836] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 689.073836] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] raise exceptions.translate_fault(task_info.error) [ 689.073836] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 689.073836] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Faults: ['InvalidArgument'] [ 689.073836] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] [ 689.073836] env[61439]: INFO nova.compute.manager [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 
tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Terminating instance [ 689.073836] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 689.074128] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 689.074128] env[61439]: DEBUG oslo_concurrency.lockutils [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Acquiring lock "refresh_cache-2afc8edb-3331-476a-bda3-4f8071461084" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 689.074128] env[61439]: DEBUG oslo_concurrency.lockutils [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Acquired lock "refresh_cache-2afc8edb-3331-476a-bda3-4f8071461084" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 689.074128] env[61439]: DEBUG nova.network.neutron [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Building network info cache for instance {{(pid=61439) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:2010}} [ 689.075178] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e4dd29f6-224f-4c38-aa57-10cd792003c3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 689.085915] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 689.086098] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=61439) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 689.087443] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cbc76c4c-efce-4e40-9e11-52d0f32e0a75 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 689.099148] env[61439]: DEBUG oslo_vmware.api [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Waiting for the task: (returnval){ [ 689.099148] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]526fab16-c05d-49d1-995e-50664b778aa7" [ 689.099148] env[61439]: _type = "Task" [ 689.099148] env[61439]: } to complete. 
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 689.110634] env[61439]: DEBUG oslo_vmware.api [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]526fab16-c05d-49d1-995e-50664b778aa7, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 689.147771] env[61439]: DEBUG nova.network.neutron [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 689.361228] env[61439]: DEBUG nova.network.neutron [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 689.378033] env[61439]: DEBUG oslo_concurrency.lockutils [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Releasing lock "refresh_cache-4998340d-3afc-4fc7-bc2a-57913b534d36" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 689.378033] env[61439]: DEBUG nova.compute.manager [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 689.378033] env[61439]: DEBUG nova.compute.manager [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 689.378033] env[61439]: DEBUG nova.network.neutron [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 689.481731] env[61439]: DEBUG nova.network.neutron [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 689.495034] env[61439]: DEBUG nova.network.neutron [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 689.496262] env[61439]: DEBUG nova.network.neutron [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Successfully created port: 661fd24c-c2b1-48c2-9bed-c6b679f2ea0a {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 689.506542] env[61439]: INFO nova.compute.manager [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] [instance: 4998340d-3afc-4fc7-bc2a-57913b534d36] Took 0.13 seconds to deallocate network for instance. 
[ 689.556044] env[61439]: DEBUG nova.network.neutron [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 689.564734] env[61439]: DEBUG oslo_concurrency.lockutils [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Releasing lock "refresh_cache-2afc8edb-3331-476a-bda3-4f8071461084" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 689.565146] env[61439]: DEBUG nova.compute.manager [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 689.565581] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 689.566399] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99f5e01a-279a-4ef5-974e-b726d8fa2684 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 689.577288] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Unregistering the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 689.577549] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-43938a57-07c9-464e-b97f-4a48750e5356 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 689.618283] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Preparing fetch location {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 689.618585] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Creating directory with path [datastore2] vmware_temp/946b6cdf-92de-4e95-9322-e85f342255a0/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 
{{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 689.619181] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1ec9c2df-042f-43c4-ab2f-425d65a6d0fb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 689.634624] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Unregistered the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 689.634818] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Deleting contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 689.635011] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Deleting the datastore file [datastore2] 2afc8edb-3331-476a-bda3-4f8071461084 {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 689.635274] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c0b6cc44-08c4-4622-838b-8bd5db2dc185 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 689.639612] env[61439]: INFO nova.scheduler.client.report [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Deleted allocations for instance 4998340d-3afc-4fc7-bc2a-57913b534d36 [ 689.649134] 
env[61439]: DEBUG oslo_vmware.api [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Waiting for the task: (returnval){ [ 689.649134] env[61439]: value = "task-987657" [ 689.649134] env[61439]: _type = "Task" [ 689.649134] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 689.649938] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Created directory with path [datastore2] vmware_temp/946b6cdf-92de-4e95-9322-e85f342255a0/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 689.649938] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Fetch image to [datastore2] vmware_temp/946b6cdf-92de-4e95-9322-e85f342255a0/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 689.649938] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to [datastore2] vmware_temp/946b6cdf-92de-4e95-9322-e85f342255a0/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 689.652470] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-de6e5c2e-04c4-44bd-97d3-68c48d16d06c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 689.668665] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7a9e6b0-0045-47fe-b1bc-cbb38c56d469 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 689.673500] env[61439]: DEBUG oslo_concurrency.lockutils [None req-93424e1c-6008-4427-8041-e87d2479003c tempest-ImagesOneServerTestJSON-2133586259 tempest-ImagesOneServerTestJSON-2133586259-project-member] Lock "4998340d-3afc-4fc7-bc2a-57913b534d36" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.245s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 689.682410] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd87fcac-22b0-42eb-8564-a9303d5a3ad7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 689.715471] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57662875-128c-47b7-9d6e-01b8e8c7db97 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 689.722868] env[61439]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-879a8060-726c-4d31-9161-613e1fe41a63 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 689.748428] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Downloading 
image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 689.822362] env[61439]: DEBUG oslo_vmware.rw_handles [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/946b6cdf-92de-4e95-9322-e85f342255a0/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 689.889856] env[61439]: DEBUG oslo_vmware.rw_handles [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Completed reading data from the image iterator. {{(pid=61439) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 689.890057] env[61439]: DEBUG oslo_vmware.rw_handles [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/946b6cdf-92de-4e95-9322-e85f342255a0/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 690.163961] env[61439]: DEBUG oslo_vmware.api [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Task: {'id': task-987657, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.058321} completed successfully. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 690.164249] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Deleted the datastore file {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 690.164575] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Deleted contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 690.167753] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 690.167753] env[61439]: INFO nova.compute.manager [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Took 0.60 seconds to destroy the instance on the hypervisor.
[ 690.167753] env[61439]: DEBUG oslo.service.loopingcall [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 690.167753] env[61439]: DEBUG nova.compute.manager [-] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Skipping network deallocation for instance since networking was not requested. {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 690.169424] env[61439]: DEBUG nova.compute.claims [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 690.169680] env[61439]: DEBUG oslo_concurrency.lockutils [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 690.169999] env[61439]: DEBUG oslo_concurrency.lockutils [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 690.323229] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74fbaf93-790f-499c-a4fe-0ed05edf8ea3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 690.332324] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de6d2250-db97-4c03-8da8-d2e63c264929 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 690.369098] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4465ca17-744c-4840-aff5-2f85bb7e183e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 690.380596] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc10b370-f7b9-49eb-9961-e5c57a256a3c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 690.398819] env[61439]: DEBUG nova.compute.provider_tree [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 690.412681] env[61439]: DEBUG nova.scheduler.client.report [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 690.432324] env[61439]: DEBUG oslo_concurrency.lockutils [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.262s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 690.433011] env[61439]: ERROR nova.compute.manager [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 690.433011] env[61439]: Faults: ['InvalidArgument']
[ 690.433011] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Traceback (most recent call last):
[ 690.433011] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 690.433011] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084]     self.driver.spawn(context, instance, image_meta,
[ 690.433011] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 690.433011] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 690.433011] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 690.433011] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084]     self._fetch_image_if_missing(context, vi)
[ 690.433011] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 690.433011] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084]     image_cache(vi, tmp_image_ds_loc)
[ 690.433011] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 690.433712] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084]     vm_util.copy_virtual_disk(
[ 690.433712] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 690.433712] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084]     session._wait_for_task(vmdk_copy_task)
[ 690.433712] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 690.433712] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084]     return self.wait_for_task(task_ref)
[ 690.433712] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 690.433712] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084]     return evt.wait()
[ 690.433712] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 690.433712] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084]     result = hub.switch()
[ 690.433712] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 690.433712] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084]     return self.greenlet.switch()
[ 690.433712] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 690.433712] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084]     self.f(*self.args, **self.kw)
[ 690.434112] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 690.434112] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084]     raise exceptions.translate_fault(task_info.error)
[ 690.434112] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 690.434112] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Faults: ['InvalidArgument']
[ 690.434112] env[61439]: ERROR nova.compute.manager [instance: 2afc8edb-3331-476a-bda3-4f8071461084]
[ 690.434112] env[61439]: DEBUG nova.compute.utils [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] VimFaultException {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 690.435480] env[61439]: DEBUG nova.compute.manager [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Build of instance 2afc8edb-3331-476a-bda3-4f8071461084 was re-scheduled: A specified parameter was not correct: fileType
[ 690.435480] env[61439]: Faults: ['InvalidArgument'] {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 690.435996] env[61439]: DEBUG nova.compute.manager [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 690.436168] env[61439]: DEBUG oslo_concurrency.lockutils [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Acquiring lock "refresh_cache-2afc8edb-3331-476a-bda3-4f8071461084" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 690.436353] env[61439]: DEBUG oslo_concurrency.lockutils [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Acquired lock "refresh_cache-2afc8edb-3331-476a-bda3-4f8071461084" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 690.436740] env[61439]: DEBUG nova.network.neutron [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 690.553843] env[61439]: DEBUG nova.network.neutron [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 690.910205] env[61439]: ERROR nova.compute.manager [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 0d60095c-bda7-480f-944a-27d6f3cfd665, please check neutron logs for more information.
[ 690.910205] env[61439]: ERROR nova.compute.manager Traceback (most recent call last):
[ 690.910205] env[61439]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 690.910205] env[61439]: ERROR nova.compute.manager     nwinfo = self.network_api.allocate_for_instance(
[ 690.910205] env[61439]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 690.910205] env[61439]: ERROR nova.compute.manager     created_port_ids = self._update_ports_for_instance(
[ 690.910205] env[61439]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 690.910205] env[61439]: ERROR nova.compute.manager     with excutils.save_and_reraise_exception():
[ 690.910205] env[61439]: ERROR nova.compute.manager   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 690.910205] env[61439]: ERROR nova.compute.manager     self.force_reraise()
[ 690.910205] env[61439]: ERROR nova.compute.manager   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 690.910205] env[61439]: ERROR nova.compute.manager     raise self.value
[ 690.910205] env[61439]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 690.910205] env[61439]: ERROR nova.compute.manager     updated_port = self._update_port(
[ 690.910205] env[61439]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 690.910205] env[61439]: ERROR nova.compute.manager     _ensure_no_port_binding_failure(port)
[ 690.910769] env[61439]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 690.910769] env[61439]: ERROR nova.compute.manager     raise exception.PortBindingFailed(port_id=port['id'])
[ 690.910769] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 0d60095c-bda7-480f-944a-27d6f3cfd665, please check neutron logs for more information.
[ 690.910769] env[61439]: ERROR nova.compute.manager
[ 690.910769] env[61439]: Traceback (most recent call last):
[ 690.910769] env[61439]:   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait
[ 690.910769] env[61439]:     listener.cb(fileno)
[ 690.910769] env[61439]:   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 690.910769] env[61439]:     result = function(*args, **kwargs)
[ 690.910769] env[61439]:   File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 690.910769] env[61439]:     return func(*args, **kwargs)
[ 690.910769] env[61439]:   File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 690.910769] env[61439]:     raise e
[ 690.910769] env[61439]:   File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 690.910769] env[61439]:     nwinfo = self.network_api.allocate_for_instance(
[ 690.910769] env[61439]:   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 690.910769] env[61439]:     created_port_ids = self._update_ports_for_instance(
[ 690.910769] env[61439]:   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 690.910769] env[61439]:     with excutils.save_and_reraise_exception():
[ 690.910769] env[61439]:   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 690.910769] env[61439]:     self.force_reraise()
[ 690.910769] env[61439]:   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 690.910769] env[61439]:     raise self.value
[ 690.910769] env[61439]:   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 690.910769] env[61439]:     updated_port = self._update_port(
[ 690.910769] env[61439]:   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 690.910769] env[61439]:     _ensure_no_port_binding_failure(port)
[ 690.910769] env[61439]:   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 690.910769] env[61439]:     raise exception.PortBindingFailed(port_id=port['id'])
[ 690.911576] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 0d60095c-bda7-480f-944a-27d6f3cfd665, please check neutron logs for more information.
[ 690.911576] env[61439]: Removing descriptor: 20
[ 690.911576] env[61439]: ERROR nova.compute.manager [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 0d60095c-bda7-480f-944a-27d6f3cfd665, please check neutron logs for more information.
[ 690.911576] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Traceback (most recent call last):
[ 690.911576] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]   File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 690.911576] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]     yield resources
[ 690.911576] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 690.911576] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]     self.driver.spawn(context, instance, image_meta,
[ 690.911576] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 690.911576] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 690.911576] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 690.911576] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]     vm_ref = self.build_virtual_machine(instance,
[ 690.911985] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 690.911985] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 690.911985] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 690.911985] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]     for vif in network_info:
[ 690.911985] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 690.911985] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]     return self._sync_wrapper(fn, *args, **kwargs)
[ 690.911985] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 690.911985] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]     self.wait()
[ 690.911985] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 690.911985] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]     self[:] = self._gt.wait()
[ 690.911985] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 690.911985] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]     return self._exit_event.wait()
[ 690.911985] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 690.912418] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]     result = hub.switch()
[ 690.912418] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 690.912418] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]     return self.greenlet.switch()
[ 690.912418] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 690.912418] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]     result = function(*args, **kwargs)
[ 690.912418] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]   File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 690.912418] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]     return func(*args, **kwargs)
[ 690.912418] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]   File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 690.912418] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]     raise e
[ 690.912418] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]   File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 690.912418] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]     nwinfo = self.network_api.allocate_for_instance(
[ 690.912418] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 690.912418] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]     created_port_ids = self._update_ports_for_instance(
[ 690.912877] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 690.912877] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]     with excutils.save_and_reraise_exception():
[ 690.912877] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 690.912877] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]     self.force_reraise()
[ 690.912877] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 690.912877] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]     raise self.value
[ 690.912877] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 690.912877] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]     updated_port = self._update_port(
[ 690.912877] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 690.912877] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]     _ensure_no_port_binding_failure(port)
[ 690.912877] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 690.912877] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]     raise exception.PortBindingFailed(port_id=port['id'])
[ 690.913274] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] nova.exception.PortBindingFailed: Binding failed for port 0d60095c-bda7-480f-944a-27d6f3cfd665, please check neutron logs for more information.
[ 690.913274] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4]
[ 690.913274] env[61439]: INFO nova.compute.manager [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Terminating instance
[ 690.913274] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Acquiring lock "refresh_cache-429ae3c8-1646-413e-a821-0d37a2ec20e4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 690.913274] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Acquired lock "refresh_cache-429ae3c8-1646-413e-a821-0d37a2ec20e4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 690.913274] env[61439]: DEBUG nova.network.neutron [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 690.965873] env[61439]: DEBUG nova.network.neutron [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 690.977423] env[61439]: DEBUG oslo_concurrency.lockutils [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Releasing lock "refresh_cache-2afc8edb-3331-476a-bda3-4f8071461084" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 690.977695] env[61439]: DEBUG nova.compute.manager [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 690.977891] env[61439]: DEBUG nova.compute.manager [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] [instance: 2afc8edb-3331-476a-bda3-4f8071461084] Skipping network deallocation for instance since networking was not requested. {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 691.019340] env[61439]: DEBUG nova.network.neutron [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 691.124552] env[61439]: INFO nova.scheduler.client.report [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Deleted allocations for instance 2afc8edb-3331-476a-bda3-4f8071461084
[ 691.158192] env[61439]: DEBUG oslo_concurrency.lockutils [None req-988e22c9-c6db-454a-ba6a-e2e8a94cdef7 tempest-ServersAdmin275Test-1748718107 tempest-ServersAdmin275Test-1748718107-project-member] Lock "2afc8edb-3331-476a-bda3-4f8071461084" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 53.233s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 691.655970] env[61439]: DEBUG nova.network.neutron [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 691.668599] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Releasing lock "refresh_cache-429ae3c8-1646-413e-a821-0d37a2ec20e4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 691.669188] env[61439]: DEBUG nova.compute.manager [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 691.669765] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-91ee762a-f717-49e3-a829-f7f731f04f1d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 691.681819] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7e94cdf-7137-4b2c-811a-36f4fdb2a479 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 691.714297] env[61439]: WARNING nova.virt.vmwareapi.driver [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Instance does not exists. Proceeding to delete instance properties on datastore: nova.exception.InstanceNotFound: Instance 429ae3c8-1646-413e-a821-0d37a2ec20e4 could not be found.
[ 691.714553] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 691.714854] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9a2ac1a7-eadb-4c0e-a8d1-af2792754ec6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 691.723895] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c3151f7-1143-4643-ae4a-c707db7a8377 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 691.750506] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 429ae3c8-1646-413e-a821-0d37a2ec20e4 could not be found.
[ 691.750992] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 691.750992] env[61439]: INFO nova.compute.manager [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Took 0.08 seconds to destroy the instance on the hypervisor.
[ 691.751318] env[61439]: DEBUG oslo.service.loopingcall [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 691.751790] env[61439]: DEBUG nova.compute.manager [-] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 691.751790] env[61439]: DEBUG nova.network.neutron [-] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 691.812502] env[61439]: DEBUG nova.network.neutron [-] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 691.825376] env[61439]: DEBUG nova.network.neutron [-] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 691.836619] env[61439]: INFO nova.compute.manager [-] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Took 0.08 seconds to deallocate network for instance.
[ 691.915978] env[61439]: INFO nova.compute.manager [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Took 0.08 seconds to detach 1 volumes for instance.
[ 691.919285] env[61439]: DEBUG nova.compute.claims [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 691.919285] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 691.919285] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 692.071842] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef57bfce-ace2-4dd5-a32d-647640cf2455 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 692.081742] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37473ce2-2dc6-4902-9b04-6152111dfa6e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 692.117950] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd1024c8-d3a1-4c11-91e8-039daf91faf4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 692.127200] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22180911-3afe-42e2-8b5d-17f765fa9a05 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 692.142903] env[61439]: DEBUG nova.compute.provider_tree [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 692.155279] env[61439]: DEBUG nova.scheduler.client.report [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 692.172355] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.253s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 692.173258] env[61439]: ERROR nova.compute.manager [None
req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 0d60095c-bda7-480f-944a-27d6f3cfd665, please check neutron logs for more information. [ 692.173258] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Traceback (most recent call last): [ 692.173258] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 692.173258] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] self.driver.spawn(context, instance, image_meta, [ 692.173258] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 692.173258] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 692.173258] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 692.173258] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] vm_ref = self.build_virtual_machine(instance, [ 692.173258] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 692.173258] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] vif_infos = vmwarevif.get_vif_info(self._session, [ 692.173258] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] File 
"/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 692.173669] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] for vif in network_info: [ 692.173669] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 692.173669] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] return self._sync_wrapper(fn, *args, **kwargs) [ 692.173669] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 692.173669] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] self.wait() [ 692.173669] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 692.173669] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] self[:] = self._gt.wait() [ 692.173669] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 692.173669] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] return self._exit_event.wait() [ 692.173669] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 692.173669] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] result = hub.switch() [ 692.173669] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 692.173669] env[61439]: ERROR 
nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] return self.greenlet.switch() [ 692.174418] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 692.174418] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] result = function(*args, **kwargs) [ 692.174418] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 692.174418] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] return func(*args, **kwargs) [ 692.174418] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 692.174418] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] raise e [ 692.174418] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 692.174418] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] nwinfo = self.network_api.allocate_for_instance( [ 692.174418] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 692.174418] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] created_port_ids = self._update_ports_for_instance( [ 692.174418] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 692.174418] env[61439]: ERROR nova.compute.manager [instance: 
429ae3c8-1646-413e-a821-0d37a2ec20e4] with excutils.save_and_reraise_exception(): [ 692.174418] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 692.174867] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] self.force_reraise() [ 692.174867] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 692.174867] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] raise self.value [ 692.174867] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 692.174867] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] updated_port = self._update_port( [ 692.174867] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 692.174867] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] _ensure_no_port_binding_failure(port) [ 692.174867] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 692.174867] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] raise exception.PortBindingFailed(port_id=port['id']) [ 692.174867] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] nova.exception.PortBindingFailed: Binding failed for port 0d60095c-bda7-480f-944a-27d6f3cfd665, please check neutron logs for more information. 
[ 692.174867] env[61439]: ERROR nova.compute.manager [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] [ 692.175229] env[61439]: DEBUG nova.compute.utils [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Binding failed for port 0d60095c-bda7-480f-944a-27d6f3cfd665, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 692.176289] env[61439]: DEBUG nova.compute.manager [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Build of instance 429ae3c8-1646-413e-a821-0d37a2ec20e4 was re-scheduled: Binding failed for port 0d60095c-bda7-480f-944a-27d6f3cfd665, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 692.176801] env[61439]: DEBUG nova.compute.manager [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 692.177094] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Acquiring lock "refresh_cache-429ae3c8-1646-413e-a821-0d37a2ec20e4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 692.177286] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 
tempest-ServerActionsV293TestJSON-231157793-project-member] Acquired lock "refresh_cache-429ae3c8-1646-413e-a821-0d37a2ec20e4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 692.177485] env[61439]: DEBUG nova.network.neutron [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 692.265322] env[61439]: DEBUG nova.network.neutron [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 692.898167] env[61439]: ERROR nova.compute.manager [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 3e6a70c5-5d66-4536-bff8-68f7bc1921c8, please check neutron logs for more information. 
[ 692.898167] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 692.898167] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 692.898167] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 692.898167] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 692.898167] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 692.898167] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 692.898167] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 692.898167] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 692.898167] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 692.898167] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 692.898167] env[61439]: ERROR nova.compute.manager raise self.value [ 692.898167] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 692.898167] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 692.898167] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 692.898167] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 692.899092] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 692.899092] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 692.899092] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 3e6a70c5-5d66-4536-bff8-68f7bc1921c8, please check neutron logs for more information. [ 692.899092] env[61439]: ERROR nova.compute.manager [ 692.899092] env[61439]: Traceback (most recent call last): [ 692.899092] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 692.899092] env[61439]: listener.cb(fileno) [ 692.899092] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 692.899092] env[61439]: result = function(*args, **kwargs) [ 692.899092] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 692.899092] env[61439]: return func(*args, **kwargs) [ 692.899092] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 692.899092] env[61439]: raise e [ 692.899092] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 692.899092] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 692.899092] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 692.899092] env[61439]: created_port_ids = self._update_ports_for_instance( [ 692.899092] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 692.899092] env[61439]: with excutils.save_and_reraise_exception(): [ 692.899092] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 692.899092] env[61439]: self.force_reraise() [ 692.899092] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 692.899092] env[61439]: raise self.value [ 692.899092] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 692.899092] env[61439]: 
updated_port = self._update_port( [ 692.899092] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 692.899092] env[61439]: _ensure_no_port_binding_failure(port) [ 692.899092] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 692.899092] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 692.899947] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 3e6a70c5-5d66-4536-bff8-68f7bc1921c8, please check neutron logs for more information. [ 692.899947] env[61439]: Removing descriptor: 22 [ 692.899947] env[61439]: ERROR nova.compute.manager [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 3e6a70c5-5d66-4536-bff8-68f7bc1921c8, please check neutron logs for more information. 
[ 692.899947] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Traceback (most recent call last): [ 692.899947] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 692.899947] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] yield resources [ 692.899947] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 692.899947] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] self.driver.spawn(context, instance, image_meta, [ 692.899947] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 692.899947] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 692.899947] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 692.899947] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] vm_ref = self.build_virtual_machine(instance, [ 692.900508] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 692.900508] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] vif_infos = vmwarevif.get_vif_info(self._session, [ 692.900508] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 692.900508] env[61439]: ERROR 
nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] for vif in network_info: [ 692.900508] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 692.900508] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] return self._sync_wrapper(fn, *args, **kwargs) [ 692.900508] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 692.900508] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] self.wait() [ 692.900508] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 692.900508] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] self[:] = self._gt.wait() [ 692.900508] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 692.900508] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] return self._exit_event.wait() [ 692.900508] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 692.900870] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] result = hub.switch() [ 692.900870] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 692.900870] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] return self.greenlet.switch() [ 692.900870] 
env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 692.900870] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] result = function(*args, **kwargs) [ 692.900870] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 692.900870] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] return func(*args, **kwargs) [ 692.900870] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 692.900870] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] raise e [ 692.900870] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 692.900870] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] nwinfo = self.network_api.allocate_for_instance( [ 692.900870] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 692.900870] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] created_port_ids = self._update_ports_for_instance( [ 692.902550] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 692.902550] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] with excutils.save_and_reraise_exception(): [ 692.902550] env[61439]: ERROR nova.compute.manager 
[instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 692.902550] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] self.force_reraise() [ 692.902550] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 692.902550] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] raise self.value [ 692.902550] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 692.902550] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] updated_port = self._update_port( [ 692.902550] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 692.902550] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] _ensure_no_port_binding_failure(port) [ 692.902550] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 692.902550] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] raise exception.PortBindingFailed(port_id=port['id']) [ 692.903099] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] nova.exception.PortBindingFailed: Binding failed for port 3e6a70c5-5d66-4536-bff8-68f7bc1921c8, please check neutron logs for more information. 
[ 692.903099] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] [ 692.903099] env[61439]: INFO nova.compute.manager [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Terminating instance [ 692.903099] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Acquiring lock "refresh_cache-0e994334-34bc-4c80-8825-8cfec714f81f" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 692.903099] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Acquired lock "refresh_cache-0e994334-34bc-4c80-8825-8cfec714f81f" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 692.903099] env[61439]: DEBUG nova.network.neutron [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 692.967156] env[61439]: DEBUG nova.network.neutron [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 693.023037] env[61439]: DEBUG nova.network.neutron [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 693.037206] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Releasing lock "refresh_cache-429ae3c8-1646-413e-a821-0d37a2ec20e4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 693.037502] env[61439]: DEBUG nova.compute.manager [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 693.037697] env[61439]: DEBUG nova.compute.manager [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 693.037867] env[61439]: DEBUG nova.network.neutron [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 693.099570] env[61439]: DEBUG nova.network.neutron [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 693.109276] env[61439]: DEBUG nova.network.neutron [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 693.124685] env[61439]: INFO nova.compute.manager [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] [instance: 429ae3c8-1646-413e-a821-0d37a2ec20e4] Took 0.08 seconds to deallocate network for instance. 
[ 693.235548] env[61439]: DEBUG nova.network.neutron [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 693.241169] env[61439]: INFO nova.scheduler.client.report [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Deleted allocations for instance 429ae3c8-1646-413e-a821-0d37a2ec20e4 [ 693.255883] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Releasing lock "refresh_cache-0e994334-34bc-4c80-8825-8cfec714f81f" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 693.255883] env[61439]: DEBUG nova.compute.manager [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 693.255883] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 693.256275] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-23a3c324-6446-4562-b55c-07a3098fc8d8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 693.271548] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e92bc28-9b2b-49a4-a712-332240bedce6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 693.289704] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e8ee6c2-d77e-4d6a-8545-c4900ef915b6 tempest-ServerActionsV293TestJSON-231157793 tempest-ServerActionsV293TestJSON-231157793-project-member] Lock "429ae3c8-1646-413e-a821-0d37a2ec20e4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.623s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 693.307136] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 0e994334-34bc-4c80-8825-8cfec714f81f could not be found. 
[ 693.307136] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 693.307277] env[61439]: INFO nova.compute.manager [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Took 0.05 seconds to destroy the instance on the hypervisor. [ 693.307605] env[61439]: DEBUG oslo.service.loopingcall [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 693.310087] env[61439]: DEBUG nova.compute.manager [-] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 693.310087] env[61439]: DEBUG nova.network.neutron [-] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 693.354772] env[61439]: DEBUG nova.network.neutron [-] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 693.365349] env[61439]: DEBUG nova.network.neutron [-] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 693.379599] env[61439]: INFO nova.compute.manager [-] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Took 0.07 seconds to deallocate network for instance. [ 693.384442] env[61439]: DEBUG nova.compute.claims [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 693.387018] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 693.387018] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 693.581782] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55960944-a89a-4aee-a1e6-c45436822115 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 693.592148] env[61439]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a25c4b7-b533-48db-9ccf-07b2bb9a0eea {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 693.628864] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d35da3e-07bc-4977-b3d4-7a36c97c921c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 693.638129] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc371c22-fc03-4783-a266-13bcb460a21c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 693.657398] env[61439]: DEBUG nova.compute.provider_tree [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 693.675347] env[61439]: DEBUG nova.scheduler.client.report [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 693.694918] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c32066ce-073f-4505-a972-1a626174f3c3 
tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.309s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 693.695544] env[61439]: ERROR nova.compute.manager [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 3e6a70c5-5d66-4536-bff8-68f7bc1921c8, please check neutron logs for more information. [ 693.695544] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Traceback (most recent call last): [ 693.695544] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 693.695544] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] self.driver.spawn(context, instance, image_meta, [ 693.695544] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 693.695544] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 693.695544] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 693.695544] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] vm_ref = self.build_virtual_machine(instance, [ 693.695544] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 693.695544] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] vif_infos = vmwarevif.get_vif_info(self._session, [ 693.695544] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 693.696422] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] for vif in network_info: [ 693.696422] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 693.696422] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] return self._sync_wrapper(fn, *args, **kwargs) [ 693.696422] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 693.696422] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] self.wait() [ 693.696422] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 693.696422] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] self[:] = self._gt.wait() [ 693.696422] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 693.696422] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] return self._exit_event.wait() [ 693.696422] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 693.696422] env[61439]: 
ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] result = hub.switch() [ 693.696422] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 693.696422] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] return self.greenlet.switch() [ 693.697729] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 693.697729] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] result = function(*args, **kwargs) [ 693.697729] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 693.697729] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] return func(*args, **kwargs) [ 693.697729] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 693.697729] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] raise e [ 693.697729] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 693.697729] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] nwinfo = self.network_api.allocate_for_instance( [ 693.697729] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 693.697729] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] 
created_port_ids = self._update_ports_for_instance( [ 693.697729] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 693.697729] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] with excutils.save_and_reraise_exception(): [ 693.697729] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 693.698421] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] self.force_reraise() [ 693.698421] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 693.698421] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] raise self.value [ 693.698421] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 693.698421] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] updated_port = self._update_port( [ 693.698421] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 693.698421] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] _ensure_no_port_binding_failure(port) [ 693.698421] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 693.698421] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] raise 
exception.PortBindingFailed(port_id=port['id']) [ 693.698421] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] nova.exception.PortBindingFailed: Binding failed for port 3e6a70c5-5d66-4536-bff8-68f7bc1921c8, please check neutron logs for more information. [ 693.698421] env[61439]: ERROR nova.compute.manager [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] [ 693.698953] env[61439]: DEBUG nova.compute.utils [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Binding failed for port 3e6a70c5-5d66-4536-bff8-68f7bc1921c8, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 693.698953] env[61439]: DEBUG nova.compute.manager [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Build of instance 0e994334-34bc-4c80-8825-8cfec714f81f was re-scheduled: Binding failed for port 3e6a70c5-5d66-4536-bff8-68f7bc1921c8, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 693.698953] env[61439]: DEBUG nova.compute.manager [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 693.700580] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Acquiring lock "refresh_cache-0e994334-34bc-4c80-8825-8cfec714f81f" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 693.700580] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Acquired lock "refresh_cache-0e994334-34bc-4c80-8825-8cfec714f81f" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 693.700580] env[61439]: DEBUG nova.network.neutron [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 693.788031] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Acquiring lock "4a2752b8-156a-4ec2-810c-3d6e7bc8554e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 693.788271] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Lock "4a2752b8-156a-4ec2-810c-3d6e7bc8554e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 693.800062] env[61439]: DEBUG nova.compute.manager [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 693.866191] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 693.866782] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 693.868265] env[61439]: INFO nova.compute.claims [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 
4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 693.979366] env[61439]: DEBUG nova.network.neutron [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 694.002669] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa647399-75f2-410c-82c7-7617a9a54d48 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 694.013476] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e74fe805-f273-4812-a894-ad141a8eb017 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 694.054020] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a78f7924-5022-41b3-a763-2e4760450dbf {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 694.062681] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac104c12-a9a1-4ad2-a870-5ac9a8c2f669 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 694.080154] env[61439]: DEBUG nova.compute.provider_tree [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 694.095597] env[61439]: DEBUG 
nova.scheduler.client.report [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 694.112498] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.246s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 694.113008] env[61439]: DEBUG nova.compute.manager [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Start building networks asynchronously for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 694.147958] env[61439]: DEBUG nova.compute.utils [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 694.149280] env[61439]: DEBUG nova.compute.manager [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 694.149591] env[61439]: DEBUG nova.network.neutron [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 694.159281] env[61439]: DEBUG nova.compute.manager [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Start building block device mappings for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 694.223641] env[61439]: DEBUG nova.network.neutron [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 694.231107] env[61439]: DEBUG nova.compute.manager [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 694.235381] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Releasing lock "refresh_cache-0e994334-34bc-4c80-8825-8cfec714f81f" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 694.235611] env[61439]: DEBUG nova.compute.manager [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 694.235774] env[61439]: DEBUG nova.compute.manager [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 694.235939] env[61439]: DEBUG nova.network.neutron [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 694.260819] env[61439]: DEBUG nova.virt.hardware [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 694.260819] env[61439]: DEBUG nova.virt.hardware [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef 
tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 694.260819] env[61439]: DEBUG nova.virt.hardware [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 694.260996] env[61439]: DEBUG nova.virt.hardware [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 694.261400] env[61439]: DEBUG nova.virt.hardware [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 694.261717] env[61439]: DEBUG nova.virt.hardware [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 694.262077] env[61439]: DEBUG nova.virt.hardware [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 694.262403] env[61439]: DEBUG nova.virt.hardware [None 
req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 694.262708] env[61439]: DEBUG nova.virt.hardware [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 694.263011] env[61439]: DEBUG nova.virt.hardware [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 694.263327] env[61439]: DEBUG nova.virt.hardware [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 694.264738] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ad902f0-14f7-4f02-95b5-bb208d35488d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 694.278111] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-243d25e6-c9b9-4683-9d11-06098dc38a39 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 694.306535] env[61439]: DEBUG nova.network.neutron [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 
tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 694.313294] env[61439]: DEBUG nova.network.neutron [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 694.323446] env[61439]: INFO nova.compute.manager [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] [instance: 0e994334-34bc-4c80-8825-8cfec714f81f] Took 0.09 seconds to deallocate network for instance. [ 694.337702] env[61439]: DEBUG nova.policy [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0afa592ffb1c4a7cb7389fa1d5b0e0e4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0ea0ee0c77d94ffe9ad2da96628a9075', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 694.451256] env[61439]: INFO nova.scheduler.client.report [None req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Deleted allocations for instance 0e994334-34bc-4c80-8825-8cfec714f81f [ 694.473774] env[61439]: DEBUG oslo_concurrency.lockutils [None 
req-c32066ce-073f-4505-a972-1a626174f3c3 tempest-MigrationsAdminTest-1447096877 tempest-MigrationsAdminTest-1447096877-project-member] Lock "0e994334-34bc-4c80-8825-8cfec714f81f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.304s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 695.878308] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Acquiring lock "ad05c801-aff7-4a36-88e9-4994adfa3a8c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 695.878773] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Lock "ad05c801-aff7-4a36-88e9-4994adfa3a8c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 695.899236] env[61439]: DEBUG nova.compute.manager [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 695.977627] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 695.977936] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 695.981892] env[61439]: INFO nova.compute.claims [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 695.989142] env[61439]: DEBUG nova.network.neutron [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Successfully created port: 126cb36c-4690-4d88-bfab-0664961fe8ce {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 696.145987] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70a9a9b9-97e9-42d4-a3cd-4e43c64a2a68 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 696.156660] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-8721f661-29c2-4d11-a17f-43cf61ce6f05 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 696.191744] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c70cd432-92e2-40e1-93c7-9a4dcc66f8ba {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 696.201421] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-473ebe9d-5dde-4195-88b9-059d6d227bfb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 696.217014] env[61439]: DEBUG nova.compute.provider_tree [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 696.232775] env[61439]: DEBUG nova.scheduler.client.report [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 696.248941] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 
tempest-ServersTestJSON-1395993600-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.271s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 696.249500] env[61439]: DEBUG nova.compute.manager [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 696.278266] env[61439]: ERROR nova.compute.manager [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 661fd24c-c2b1-48c2-9bed-c6b679f2ea0a, please check neutron logs for more information. 
[ 696.278266] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 696.278266] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 696.278266] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 696.278266] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 696.278266] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 696.278266] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 696.278266] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 696.278266] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 696.278266] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 696.278266] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 696.278266] env[61439]: ERROR nova.compute.manager raise self.value [ 696.278266] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 696.278266] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 696.278266] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 696.278266] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 696.278821] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 696.278821] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 696.278821] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 661fd24c-c2b1-48c2-9bed-c6b679f2ea0a, please check neutron logs for more information. [ 696.278821] env[61439]: ERROR nova.compute.manager [ 696.278821] env[61439]: Traceback (most recent call last): [ 696.278821] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 696.278821] env[61439]: listener.cb(fileno) [ 696.278821] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 696.278821] env[61439]: result = function(*args, **kwargs) [ 696.278821] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 696.278821] env[61439]: return func(*args, **kwargs) [ 696.278821] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 696.278821] env[61439]: raise e [ 696.278821] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 696.278821] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 696.278821] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 696.278821] env[61439]: created_port_ids = self._update_ports_for_instance( [ 696.278821] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 696.278821] env[61439]: with excutils.save_and_reraise_exception(): [ 696.278821] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 696.278821] env[61439]: self.force_reraise() [ 696.278821] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 696.278821] env[61439]: raise self.value [ 696.278821] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 696.278821] env[61439]: 
updated_port = self._update_port( [ 696.278821] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 696.278821] env[61439]: _ensure_no_port_binding_failure(port) [ 696.278821] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 696.278821] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 696.279971] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 661fd24c-c2b1-48c2-9bed-c6b679f2ea0a, please check neutron logs for more information. [ 696.279971] env[61439]: Removing descriptor: 24 [ 696.279971] env[61439]: ERROR nova.compute.manager [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 661fd24c-c2b1-48c2-9bed-c6b679f2ea0a, please check neutron logs for more information. 
[ 696.279971] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Traceback (most recent call last): [ 696.279971] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 696.279971] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] yield resources [ 696.279971] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 696.279971] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] self.driver.spawn(context, instance, image_meta, [ 696.279971] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 696.279971] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 696.279971] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 696.279971] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] vm_ref = self.build_virtual_machine(instance, [ 696.280355] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 696.280355] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] vif_infos = vmwarevif.get_vif_info(self._session, [ 696.280355] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 696.280355] env[61439]: ERROR 
nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] for vif in network_info: [ 696.280355] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 696.280355] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] return self._sync_wrapper(fn, *args, **kwargs) [ 696.280355] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 696.280355] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] self.wait() [ 696.280355] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 696.280355] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] self[:] = self._gt.wait() [ 696.280355] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 696.280355] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] return self._exit_event.wait() [ 696.280355] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 696.280709] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] result = hub.switch() [ 696.280709] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 696.280709] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] return self.greenlet.switch() [ 696.280709] 
env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 696.280709] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] result = function(*args, **kwargs) [ 696.280709] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 696.280709] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] return func(*args, **kwargs) [ 696.280709] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 696.280709] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] raise e [ 696.280709] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 696.280709] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] nwinfo = self.network_api.allocate_for_instance( [ 696.280709] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 696.280709] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] created_port_ids = self._update_ports_for_instance( [ 696.281165] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 696.281165] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] with excutils.save_and_reraise_exception(): [ 696.281165] env[61439]: ERROR nova.compute.manager 
[instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 696.281165] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] self.force_reraise() [ 696.281165] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 696.281165] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] raise self.value [ 696.281165] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 696.281165] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] updated_port = self._update_port( [ 696.281165] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 696.281165] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] _ensure_no_port_binding_failure(port) [ 696.281165] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 696.281165] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] raise exception.PortBindingFailed(port_id=port['id']) [ 696.281812] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] nova.exception.PortBindingFailed: Binding failed for port 661fd24c-c2b1-48c2-9bed-c6b679f2ea0a, please check neutron logs for more information. 
[ 696.281812] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] [ 696.281812] env[61439]: INFO nova.compute.manager [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Terminating instance [ 696.281812] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Acquiring lock "refresh_cache-8f553865-312c-4b6a-9383-43f2cbcd0b5e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 696.281970] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Acquired lock "refresh_cache-8f553865-312c-4b6a-9383-43f2cbcd0b5e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 696.282168] env[61439]: DEBUG nova.network.neutron [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 696.299523] env[61439]: DEBUG nova.compute.utils [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 696.301318] env[61439]: DEBUG nova.compute.manager [None req-ac788df8-e141-4402-aa21-8f517cf257a9 
tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 696.303621] env[61439]: DEBUG nova.network.neutron [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 696.313592] env[61439]: DEBUG nova.compute.manager [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 696.416134] env[61439]: DEBUG nova.compute.manager [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 696.446452] env[61439]: DEBUG nova.virt.hardware [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 696.446635] env[61439]: DEBUG nova.virt.hardware [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 696.446751] env[61439]: DEBUG nova.virt.hardware [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 696.446922] env[61439]: DEBUG nova.virt.hardware [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:388}} [ 696.447080] env[61439]: DEBUG nova.virt.hardware [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 696.447234] env[61439]: DEBUG nova.virt.hardware [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 696.447445] env[61439]: DEBUG nova.virt.hardware [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 696.447598] env[61439]: DEBUG nova.virt.hardware [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 696.447762] env[61439]: DEBUG nova.virt.hardware [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 696.447925] env[61439]: DEBUG nova.virt.hardware [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:575}} [ 696.448093] env[61439]: DEBUG nova.virt.hardware [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 696.448963] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5eed8a60-a5ea-4e24-a9f2-0e004d1e4b9b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 696.459091] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3554b02-5fda-4465-b788-2b766e86d257 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 696.513111] env[61439]: DEBUG nova.policy [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a8d0a84c83694e7d820bf8fa60bcef3f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '31e1deda33684cd08f46b5bf9e64b772', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 696.591789] env[61439]: DEBUG nova.network.neutron [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 697.076041] env[61439]: DEBUG nova.network.neutron [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 697.090267] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Releasing lock "refresh_cache-8f553865-312c-4b6a-9383-43f2cbcd0b5e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 697.091017] env[61439]: DEBUG nova.compute.manager [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 697.091255] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 697.091830] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a8b9a2db-d084-4c96-adbf-50a70e1f74b3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 697.102916] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c73bb544-df83-43f4-b0e3-69747dbbba92 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 697.130701] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8f553865-312c-4b6a-9383-43f2cbcd0b5e could not be found. 
[ 697.130835] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 697.131015] env[61439]: INFO nova.compute.manager [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Took 0.04 seconds to destroy the instance on the hypervisor. [ 697.131425] env[61439]: DEBUG oslo.service.loopingcall [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 697.133649] env[61439]: DEBUG nova.compute.manager [-] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 697.133952] env[61439]: DEBUG nova.network.neutron [-] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 697.213438] env[61439]: DEBUG nova.network.neutron [-] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 697.228299] env[61439]: DEBUG nova.network.neutron [-] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 697.239571] env[61439]: INFO nova.compute.manager [-] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Took 0.11 seconds to deallocate network for instance. [ 697.241583] env[61439]: DEBUG nova.compute.claims [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 697.242077] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 697.242522] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 697.458313] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Acquiring lock "8fe2bccd-5b46-4067-b72b-bdbf726c0155" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 697.458313] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Lock "8fe2bccd-5b46-4067-b72b-bdbf726c0155" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 697.487924] env[61439]: DEBUG nova.compute.manager [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 697.511948] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-267f08d3-0b93-42d0-902e-dbdca26b8a8c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 697.522324] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b91176d0-bca0-47f7-9d47-a7016419e50f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 697.569177] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddb5990d-f324-4b70-9427-be7a10ec6918 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 697.584473] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0cd21525-2f6d-409e-b60a-777392f2b7eb 
{{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 697.598943] env[61439]: DEBUG nova.compute.provider_tree [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 697.604664] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 697.617245] env[61439]: DEBUG nova.scheduler.client.report [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 697.649743] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.406s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 697.650419] env[61439]: ERROR nova.compute.manager [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 661fd24c-c2b1-48c2-9bed-c6b679f2ea0a, please check neutron logs for more information. [ 697.650419] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Traceback (most recent call last): [ 697.650419] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 697.650419] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] self.driver.spawn(context, instance, image_meta, [ 697.650419] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 697.650419] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 697.650419] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 697.650419] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] vm_ref = self.build_virtual_machine(instance, [ 697.650419] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 697.650419] env[61439]: ERROR 
nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] vif_infos = vmwarevif.get_vif_info(self._session, [ 697.650419] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 697.650789] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] for vif in network_info: [ 697.650789] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 697.650789] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] return self._sync_wrapper(fn, *args, **kwargs) [ 697.650789] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 697.650789] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] self.wait() [ 697.650789] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 697.650789] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] self[:] = self._gt.wait() [ 697.650789] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 697.650789] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] return self._exit_event.wait() [ 697.650789] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 697.650789] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] result = hub.switch() [ 697.650789] 
env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 697.650789] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] return self.greenlet.switch() [ 697.651312] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 697.651312] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] result = function(*args, **kwargs) [ 697.651312] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 697.651312] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] return func(*args, **kwargs) [ 697.651312] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 697.651312] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] raise e [ 697.651312] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 697.651312] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] nwinfo = self.network_api.allocate_for_instance( [ 697.651312] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 697.651312] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] created_port_ids = self._update_ports_for_instance( [ 697.651312] env[61439]: ERROR nova.compute.manager [instance: 
8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 697.651312] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] with excutils.save_and_reraise_exception(): [ 697.651312] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 697.652573] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] self.force_reraise() [ 697.652573] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 697.652573] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] raise self.value [ 697.652573] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 697.652573] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] updated_port = self._update_port( [ 697.652573] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 697.652573] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] _ensure_no_port_binding_failure(port) [ 697.652573] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 697.652573] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] raise exception.PortBindingFailed(port_id=port['id']) [ 697.652573] env[61439]: ERROR nova.compute.manager [instance: 
8f553865-312c-4b6a-9383-43f2cbcd0b5e] nova.exception.PortBindingFailed: Binding failed for port 661fd24c-c2b1-48c2-9bed-c6b679f2ea0a, please check neutron logs for more information. [ 697.652573] env[61439]: ERROR nova.compute.manager [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] [ 697.653955] env[61439]: DEBUG nova.compute.utils [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Binding failed for port 661fd24c-c2b1-48c2-9bed-c6b679f2ea0a, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 697.655715] env[61439]: DEBUG nova.compute.manager [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Build of instance 8f553865-312c-4b6a-9383-43f2cbcd0b5e was re-scheduled: Binding failed for port 661fd24c-c2b1-48c2-9bed-c6b679f2ea0a, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 697.656084] env[61439]: DEBUG nova.compute.manager [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 697.656357] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Acquiring lock "refresh_cache-8f553865-312c-4b6a-9383-43f2cbcd0b5e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 697.656671] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Acquired lock "refresh_cache-8f553865-312c-4b6a-9383-43f2cbcd0b5e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 697.656671] env[61439]: DEBUG nova.network.neutron [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 697.661095] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.057s {{(pid=61439) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 697.666262] env[61439]: INFO nova.compute.claims [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 697.828640] env[61439]: DEBUG oslo_concurrency.lockutils [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Acquiring lock "f347cbeb-a096-40db-8528-6cee24d390c2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 697.828640] env[61439]: DEBUG oslo_concurrency.lockutils [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Lock "f347cbeb-a096-40db-8528-6cee24d390c2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 697.847877] env[61439]: DEBUG nova.compute.manager [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 697.880195] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a939519a-8de9-4f34-8773-e8569f06e363 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 697.894022] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35af47e0-1f98-48a3-81e6-1b4221ab2139 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 697.929676] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ef4338e-776e-431b-9453-831b391ba0ba {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 697.941428] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13e3627b-562f-42ca-85d7-bb2175e1e721 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 697.948428] env[61439]: DEBUG oslo_concurrency.lockutils [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 697.957365] env[61439]: DEBUG nova.compute.provider_tree [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 697.969358] env[61439]: DEBUG 
nova.scheduler.client.report [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 697.978981] env[61439]: DEBUG nova.network.neutron [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 697.996080] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.335s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 697.996503] env[61439]: DEBUG nova.compute.manager [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Start building networks asynchronously for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 697.998755] env[61439]: DEBUG oslo_concurrency.lockutils [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.050s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 698.000576] env[61439]: INFO nova.compute.claims [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 698.040255] env[61439]: DEBUG nova.compute.utils [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 698.044516] env[61439]: DEBUG nova.compute.manager [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Not allocating networking since 'none' was specified. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 698.052858] env[61439]: DEBUG nova.compute.manager [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Start building block device mappings for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 698.149821] env[61439]: DEBUG nova.compute.manager [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 698.188888] env[61439]: DEBUG nova.virt.hardware [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 698.190352] env[61439]: DEBUG nova.virt.hardware [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 698.190352] env[61439]: DEBUG nova.virt.hardware [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 
tempest-ServerShowV247Test-985300212-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 698.190352] env[61439]: DEBUG nova.virt.hardware [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 698.190352] env[61439]: DEBUG nova.virt.hardware [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 698.190352] env[61439]: DEBUG nova.virt.hardware [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 698.191068] env[61439]: DEBUG nova.virt.hardware [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 698.193058] env[61439]: DEBUG nova.virt.hardware [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 698.193058] env[61439]: DEBUG nova.virt.hardware [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 
tempest-ServerShowV247Test-985300212-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 698.193058] env[61439]: DEBUG nova.virt.hardware [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 698.193058] env[61439]: DEBUG nova.virt.hardware [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 698.193541] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f0aaa5e-01f9-4ef4-9ef4-7b49ffd0e34a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.202388] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9eb8cab8-3f62-4d27-a70a-81ac1056f44c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.214216] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d8991ab-58e8-41c7-81f3-43964e6af61a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.218529] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a68e64d2-f4ea-4c71-a4fb-7fed3c1473a6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.233325] env[61439]: DEBUG 
nova.virt.vmwareapi.vmops [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Instance VIF info [] {{(pid=61439) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 698.240017] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Creating folder: Project (362a29354f714f0a8c002c5d933262cb). Parent ref: group-v221281. {{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 698.269350] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e4f33b4e-2813-4936-8a86-8f7ec53bb830 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.272178] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10f4b52c-7f11-45da-8e62-83c0296eabdd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.280944] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9eaa2fd0-8268-46ec-b6f8-73d960773a7c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.286647] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Created folder: Project (362a29354f714f0a8c002c5d933262cb) in parent group-v221281. [ 698.286647] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Creating folder: Instances. 
Parent ref: group-v221292. {{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 698.287654] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ce8c6dee-edf7-4843-99f5-048038018a01 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.296525] env[61439]: DEBUG nova.compute.provider_tree [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 698.304686] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Created folder: Instances in parent group-v221292. [ 698.304946] env[61439]: DEBUG oslo.service.loopingcall [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 698.305966] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Creating VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 698.305966] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-da904197-4fab-4be1-94e0-7fa781202b7e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.318720] env[61439]: DEBUG nova.scheduler.client.report [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 698.325905] env[61439]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 698.325905] env[61439]: value = "task-987660" [ 698.325905] env[61439]: _type = "Task" [ 698.325905] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 698.333889] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987660, 'name': CreateVM_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 698.340780] env[61439]: DEBUG oslo_concurrency.lockutils [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.342s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 698.341283] env[61439]: DEBUG nova.compute.manager [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 698.379682] env[61439]: DEBUG nova.network.neutron [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 698.394096] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Releasing lock "refresh_cache-8f553865-312c-4b6a-9383-43f2cbcd0b5e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 698.394819] env[61439]: DEBUG nova.compute.manager [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Virt driver does not provide unplug_vifs method, 
so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 698.395251] env[61439]: DEBUG nova.compute.manager [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 698.395251] env[61439]: DEBUG nova.network.neutron [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 698.418277] env[61439]: DEBUG nova.compute.utils [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 698.421375] env[61439]: DEBUG nova.compute.manager [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 698.421565] env[61439]: DEBUG nova.network.neutron [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 698.436601] env[61439]: DEBUG nova.compute.manager [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 698.464474] env[61439]: DEBUG nova.network.neutron [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 698.476441] env[61439]: DEBUG nova.network.neutron [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 698.495021] env[61439]: INFO nova.compute.manager [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] [instance: 8f553865-312c-4b6a-9383-43f2cbcd0b5e] Took 0.10 seconds to deallocate network for instance. 
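The `nova.virt.hardware` records above walk through topology selection for the `m1.nano` flavor: with no flavor or image constraints the per-axis limits default to 65536 sockets/cores/threads, and for 1 vCPU the only possible topology is 1:1:1. The following is an illustrative sketch of that search, not Nova's actual `_get_possible_cpu_topologies` implementation; the function name and defaults here are simplified stand-ins.

```python
# Hedged sketch of the topology enumeration the log records describe:
# enumerate every (sockets, cores, threads) combination whose product
# equals the requested vCPU count and that respects the per-axis maxima.
from itertools import product
from collections import namedtuple

VirtCPUTopology = namedtuple('VirtCPUTopology', 'sockets cores threads')

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    """Yield every topology matching the vCPU count within the limits."""
    for s, c, t in product(range(1, min(max_sockets, vcpus) + 1),
                           range(1, min(max_cores, vcpus) + 1),
                           range(1, min(max_threads, vcpus) + 1)):
        if s * c * t == vcpus:
            yield VirtCPUTopology(sockets=s, cores=c, threads=t)

# Matches the log: 1 vcpu -> [VirtCPUTopology(cores=1,sockets=1,threads=1)]
print(list(possible_topologies(1)))
```

This mirrors why the log reports "Got 1 possible topologies" for a 1-vCPU flavor: no other factorization of 1 exists.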
[ 698.547359] env[61439]: DEBUG nova.compute.manager [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 698.589511] env[61439]: DEBUG nova.policy [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b0ff0bbf7696481aa0a4d86c30d3dc7e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0e62d95c2bfa4690a3d6662a057b6cd2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 698.597648] env[61439]: DEBUG nova.virt.hardware [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 698.597648] env[61439]: DEBUG nova.virt.hardware [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 698.597648] env[61439]: DEBUG nova.virt.hardware [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 698.599694] env[61439]: DEBUG nova.virt.hardware [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 698.599694] env[61439]: DEBUG nova.virt.hardware [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 698.599694] env[61439]: DEBUG nova.virt.hardware [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Chose sockets=0, cores=0, 
threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 698.599694] env[61439]: DEBUG nova.virt.hardware [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 698.599694] env[61439]: DEBUG nova.virt.hardware [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 698.599983] env[61439]: DEBUG nova.virt.hardware [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 698.599983] env[61439]: DEBUG nova.virt.hardware [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 698.599983] env[61439]: DEBUG nova.virt.hardware [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 698.600673] env[61439]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a623446e-c729-4c8a-be95-fbfd0e24dee6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.611532] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a11771f0-8c29-4661-8e65-9db206aa69a7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.659095] env[61439]: INFO nova.scheduler.client.report [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Deleted allocations for instance 8f553865-312c-4b6a-9383-43f2cbcd0b5e [ 698.686594] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c15b1af6-4519-4126-92ce-d46601dad842 tempest-ServersWithSpecificFlavorTestJSON-945038412 tempest-ServersWithSpecificFlavorTestJSON-945038412-project-member] Lock "8f553865-312c-4b6a-9383-43f2cbcd0b5e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.343s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 698.828323] env[61439]: DEBUG nova.network.neutron [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Successfully created port: 4fb5161b-a3fe-4af0-9a2c-e5502acf4bfd {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 698.839390] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987660, 'name': CreateVM_Task, 'duration_secs': 0.272029} completed successfully. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 698.839390] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Created VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 698.839390] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 698.839390] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 698.839563] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 698.839875] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cc72cb10-6106-4a22-96d0-16ae52be72a3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.846338] env[61439]: DEBUG oslo_vmware.api [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Waiting for the task: 
(returnval){ [ 698.846338] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]521a2fdd-6b88-3d1d-20d0-00edb871790e" [ 698.846338] env[61439]: _type = "Task" [ 698.846338] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 698.855151] env[61439]: DEBUG oslo_vmware.api [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]521a2fdd-6b88-3d1d-20d0-00edb871790e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 699.357426] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 699.357860] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Processing image a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 699.357926] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 700.537875] env[61439]: DEBUG 
nova.network.neutron [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Successfully created port: faf5d9cd-dd76-4f82-be96-e1413b39b9e5 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 702.273213] env[61439]: ERROR nova.compute.manager [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 126cb36c-4690-4d88-bfab-0664961fe8ce, please check neutron logs for more information. [ 702.273213] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 702.273213] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 702.273213] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 702.273213] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 702.273213] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 702.273213] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 702.273213] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 702.273213] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 702.273213] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 702.273213] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 702.273213] env[61439]: ERROR 
nova.compute.manager raise self.value [ 702.273213] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 702.273213] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 702.273213] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 702.273213] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 702.273997] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 702.273997] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 702.273997] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 126cb36c-4690-4d88-bfab-0664961fe8ce, please check neutron logs for more information. [ 702.273997] env[61439]: ERROR nova.compute.manager [ 702.273997] env[61439]: Traceback (most recent call last): [ 702.273997] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 702.273997] env[61439]: listener.cb(fileno) [ 702.273997] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 702.273997] env[61439]: result = function(*args, **kwargs) [ 702.273997] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 702.273997] env[61439]: return func(*args, **kwargs) [ 702.273997] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 702.273997] env[61439]: raise e [ 702.273997] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 702.273997] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 702.273997] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 
702.273997] env[61439]: created_port_ids = self._update_ports_for_instance( [ 702.273997] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 702.273997] env[61439]: with excutils.save_and_reraise_exception(): [ 702.273997] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 702.273997] env[61439]: self.force_reraise() [ 702.273997] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 702.273997] env[61439]: raise self.value [ 702.273997] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 702.273997] env[61439]: updated_port = self._update_port( [ 702.273997] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 702.273997] env[61439]: _ensure_no_port_binding_failure(port) [ 702.273997] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 702.273997] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 702.275887] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 126cb36c-4690-4d88-bfab-0664961fe8ce, please check neutron logs for more information. [ 702.275887] env[61439]: Removing descriptor: 20 [ 702.275887] env[61439]: ERROR nova.compute.manager [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 126cb36c-4690-4d88-bfab-0664961fe8ce, please check neutron logs for more information. 
[ 702.275887] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Traceback (most recent call last): [ 702.275887] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 702.275887] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] yield resources [ 702.275887] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 702.275887] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] self.driver.spawn(context, instance, image_meta, [ 702.275887] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 702.275887] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 702.275887] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 702.275887] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] vm_ref = self.build_virtual_machine(instance, [ 702.276917] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 702.276917] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] vif_infos = vmwarevif.get_vif_info(self._session, [ 702.276917] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 702.276917] env[61439]: ERROR 
nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] for vif in network_info:
[ 702.276917] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 702.276917] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] return self._sync_wrapper(fn, *args, **kwargs)
[ 702.276917] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 702.276917] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] self.wait()
[ 702.276917] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 702.276917] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] self[:] = self._gt.wait()
[ 702.276917] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 702.276917] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] return self._exit_event.wait()
[ 702.276917] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 702.277403] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] result = hub.switch()
[ 702.277403] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 702.277403] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] return self.greenlet.switch()
[ 702.277403] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 702.277403] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] result = function(*args, **kwargs)
[ 702.277403] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 702.277403] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] return func(*args, **kwargs)
[ 702.277403] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 702.277403] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] raise e
[ 702.277403] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 702.277403] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] nwinfo = self.network_api.allocate_for_instance(
[ 702.277403] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 702.277403] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] created_port_ids = self._update_ports_for_instance(
[ 702.277766] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 702.277766] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] with excutils.save_and_reraise_exception():
[ 702.277766] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 702.277766] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] self.force_reraise()
[ 702.277766] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 702.277766] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] raise self.value
[ 702.277766] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 702.277766] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] updated_port = self._update_port(
[ 702.277766] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 702.277766] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] _ensure_no_port_binding_failure(port)
[ 702.277766] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 702.277766] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] raise exception.PortBindingFailed(port_id=port['id'])
[ 702.278360] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] nova.exception.PortBindingFailed: Binding failed for port 126cb36c-4690-4d88-bfab-0664961fe8ce, please check neutron logs for more information.
[ 702.278360] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e]
[ 702.278360] env[61439]: INFO nova.compute.manager [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Terminating instance
[ 702.283708] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Acquiring lock "refresh_cache-4a2752b8-156a-4ec2-810c-3d6e7bc8554e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 702.283708] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Acquired lock "refresh_cache-4a2752b8-156a-4ec2-810c-3d6e7bc8554e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 702.283708] env[61439]: DEBUG nova.network.neutron [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 702.388200] env[61439]: DEBUG nova.network.neutron [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 702.686621] env[61439]: DEBUG nova.network.neutron [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 702.706499] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Releasing lock "refresh_cache-4a2752b8-156a-4ec2-810c-3d6e7bc8554e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 702.707303] env[61439]: DEBUG nova.compute.manager [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 702.707303] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 702.707620] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-09c4023f-3526-47f2-b23a-1f8124f2acf3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 702.719694] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbf0d0dd-92f3-4eb6-967d-3388ec1ec267 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 702.743854] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4a2752b8-156a-4ec2-810c-3d6e7bc8554e could not be found.
[ 702.744658] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 702.744658] env[61439]: INFO nova.compute.manager [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 702.744658] env[61439]: DEBUG oslo.service.loopingcall [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 702.744845] env[61439]: DEBUG nova.compute.manager [-] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 702.744910] env[61439]: DEBUG nova.network.neutron [-] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 702.762874] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Acquiring lock "4527b287-d099-443c-a424-185d02054be0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 702.763344] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Lock "4527b287-d099-443c-a424-185d02054be0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 702.781056] env[61439]: DEBUG nova.compute.manager [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 702.819879] env[61439]: DEBUG nova.network.neutron [-] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 702.828506] env[61439]: DEBUG nova.network.neutron [-] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 702.843918] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 702.844373] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 702.845868] env[61439]: INFO nova.compute.claims [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 702.848119] env[61439]: INFO nova.compute.manager [-] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Took 0.10 seconds to deallocate network for instance.
[ 702.851527] env[61439]: DEBUG nova.compute.claims [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 702.851527] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 703.012960] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffba5a4c-544d-452b-b399-b91f682e1670 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 703.023252] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0035babb-cd0d-4d74-a836-872e4be02cf6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 703.059168] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ad199bd-cb43-439f-a75b-81ae3c798a12 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 703.067217] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4f93837-73a1-4cd4-9cd7-50f1986b7860 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 703.081644] env[61439]: DEBUG nova.compute.provider_tree [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 703.091084] env[61439]: DEBUG nova.scheduler.client.report [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 703.110775] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.264s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 703.110775] env[61439]: DEBUG nova.compute.manager [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 703.111017] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.260s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 703.155939] env[61439]: DEBUG nova.compute.utils [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 703.157870] env[61439]: DEBUG nova.compute.manager [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Not allocating networking since 'none' was specified. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 703.167137] env[61439]: DEBUG nova.compute.manager [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 703.255371] env[61439]: DEBUG nova.compute.manager [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 703.284975] env[61439]: DEBUG nova.virt.hardware [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=<?>,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-09-20T17:02:48Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 703.285319] env[61439]: DEBUG nova.virt.hardware [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 703.285487] env[61439]: DEBUG nova.virt.hardware [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 703.285676] env[61439]: DEBUG nova.virt.hardware [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 703.285820] env[61439]: DEBUG nova.virt.hardware [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 703.285963] env[61439]: DEBUG nova.virt.hardware [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 703.286192] env[61439]: DEBUG nova.virt.hardware [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 703.286372] env[61439]: DEBUG nova.virt.hardware [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 703.286621] env[61439]: DEBUG nova.virt.hardware [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 703.286753] env[61439]: DEBUG nova.virt.hardware [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 703.286915] env[61439]: DEBUG nova.virt.hardware [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 703.288570] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a11cd512-21e8-47a5-bd66-e9da4574bbaa {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 703.300063] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01e76e81-8734-4c56-9de4-77cd9114c11b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 703.307236] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fd22bfa-be23-48da-b98e-98b35fed293b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 703.326561] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Instance VIF info [] {{(pid=61439) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 703.331494] env[61439]: DEBUG oslo.service.loopingcall [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 703.332921] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73a85a40-217f-4732-b896-b61456d5f71b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 703.336377] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4527b287-d099-443c-a424-185d02054be0] Creating VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 703.336618] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6eb4011a-c339-40f9-bb08-de19510903b4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 703.379414] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b30dd9ce-5399-4046-aab9-12f0da2d6949 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 703.381863] env[61439]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 703.381863] env[61439]: value = "task-987662"
[ 703.381863] env[61439]: _type = "Task"
[ 703.381863] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 703.387585] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a8d3f92-c0bc-491a-b204-841a72f04285 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 703.395257] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987662, 'name': CreateVM_Task} progress is 15%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 703.405112] env[61439]: DEBUG nova.compute.provider_tree [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 703.421022] env[61439]: DEBUG nova.scheduler.client.report [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 703.456011] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.344s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 703.456160] env[61439]: ERROR nova.compute.manager [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 126cb36c-4690-4d88-bfab-0664961fe8ce, please check neutron logs for more information.
[ 703.456160] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Traceback (most recent call last):
[ 703.456160] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 703.456160] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] self.driver.spawn(context, instance, image_meta,
[ 703.456160] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 703.456160] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 703.456160] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 703.456160] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] vm_ref = self.build_virtual_machine(instance,
[ 703.456160] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 703.456160] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] vif_infos = vmwarevif.get_vif_info(self._session,
[ 703.456160] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 703.456501] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] for vif in network_info:
[ 703.456501] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 703.456501] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] return self._sync_wrapper(fn, *args, **kwargs)
[ 703.456501] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 703.456501] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] self.wait()
[ 703.456501] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 703.456501] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] self[:] = self._gt.wait()
[ 703.456501] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 703.456501] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] return self._exit_event.wait()
[ 703.456501] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 703.456501] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] result = hub.switch()
[ 703.456501] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 703.456501] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] return self.greenlet.switch()
[ 703.456881] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 703.456881] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] result = function(*args, **kwargs)
[ 703.456881] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 703.456881] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] return func(*args, **kwargs)
[ 703.456881] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 703.456881] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] raise e
[ 703.456881] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 703.456881] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] nwinfo = self.network_api.allocate_for_instance(
[ 703.456881] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 703.456881] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] created_port_ids = self._update_ports_for_instance(
[ 703.456881] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 703.456881] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] with excutils.save_and_reraise_exception():
[ 703.456881] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 703.457273] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] self.force_reraise()
[ 703.457273] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 703.457273] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] raise self.value
[ 703.457273] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 703.457273] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] updated_port = self._update_port(
[ 703.457273] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 703.457273] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] _ensure_no_port_binding_failure(port)
[ 703.457273] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 703.457273] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] raise exception.PortBindingFailed(port_id=port['id'])
[ 703.457273] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] nova.exception.PortBindingFailed: Binding failed for port 126cb36c-4690-4d88-bfab-0664961fe8ce, please check neutron logs for more information.
[ 703.457273] env[61439]: ERROR nova.compute.manager [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e]
[ 703.457592] env[61439]: DEBUG nova.compute.utils [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Binding failed for port 126cb36c-4690-4d88-bfab-0664961fe8ce, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 703.458762] env[61439]: DEBUG nova.compute.manager [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Build of instance 4a2752b8-156a-4ec2-810c-3d6e7bc8554e was re-scheduled: Binding failed for port 126cb36c-4690-4d88-bfab-0664961fe8ce, please check neutron logs for more information.
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 703.459242] env[61439]: DEBUG nova.compute.manager [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 703.459494] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Acquiring lock "refresh_cache-4a2752b8-156a-4ec2-810c-3d6e7bc8554e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 703.459650] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Acquired lock "refresh_cache-4a2752b8-156a-4ec2-810c-3d6e7bc8554e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 703.460221] env[61439]: DEBUG nova.network.neutron [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 703.549987] env[61439]: DEBUG nova.network.neutron [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 703.889710] env[61439]: DEBUG nova.network.neutron [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 703.902132] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987662, 'name': CreateVM_Task, 'duration_secs': 0.283557} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 703.902888] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4527b287-d099-443c-a424-185d02054be0] Created VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 703.903430] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Releasing lock "refresh_cache-4a2752b8-156a-4ec2-810c-3d6e7bc8554e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 703.903651] env[61439]: DEBUG nova.compute.manager [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 703.903818] env[61439]: DEBUG nova.compute.manager [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 703.903981] env[61439]: DEBUG nova.network.neutron [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 703.905906] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 703.906442] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 703.906821] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 703.907478] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2bc9c0bb-b629-4bd4-92ab-8e0b882a87a3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.912766] env[61439]: DEBUG oslo_vmware.api [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Waiting for the task: (returnval){ [ 703.912766] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52af3b01-8fb5-a8a0-0f74-19711b555ba9" [ 703.912766] env[61439]: _type = "Task" [ 703.912766] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 703.922180] env[61439]: DEBUG oslo_vmware.api [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52af3b01-8fb5-a8a0-0f74-19711b555ba9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 703.942864] env[61439]: DEBUG nova.network.neutron [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 703.952594] env[61439]: DEBUG nova.network.neutron [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 703.963212] env[61439]: INFO nova.compute.manager [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] [instance: 4a2752b8-156a-4ec2-810c-3d6e7bc8554e] Took 0.06 seconds to deallocate network for instance. [ 704.090629] env[61439]: INFO nova.scheduler.client.report [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Deleted allocations for instance 4a2752b8-156a-4ec2-810c-3d6e7bc8554e [ 704.116653] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2d5bcf9a-ed4e-40a8-af8e-31ba992cabef tempest-ImagesNegativeTestJSON-1252542843 tempest-ImagesNegativeTestJSON-1252542843-project-member] Lock "4a2752b8-156a-4ec2-810c-3d6e7bc8554e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.328s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 704.427730] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 704.428218] env[61439]: DEBUG nova.virt.vmwareapi.vmops 
[None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Processing image a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 704.428353] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 704.595653] env[61439]: DEBUG oslo_concurrency.lockutils [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Acquiring lock "ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 704.595986] env[61439]: DEBUG oslo_concurrency.lockutils [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Lock "ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 704.618314] env[61439]: DEBUG nova.compute.manager [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 704.694518] env[61439]: DEBUG oslo_concurrency.lockutils [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 704.697073] env[61439]: DEBUG oslo_concurrency.lockutils [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 704.697792] env[61439]: INFO nova.compute.claims [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 704.824315] env[61439]: ERROR nova.compute.manager [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 4fb5161b-a3fe-4af0-9a2c-e5502acf4bfd, please check neutron logs for more information. 
[ 704.824315] env[61439]: ERROR nova.compute.manager Traceback (most recent call last):
[ 704.824315] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 704.824315] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 704.824315] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 704.824315] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 704.824315] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 704.824315] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 704.824315] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 704.824315] env[61439]: ERROR nova.compute.manager self.force_reraise()
[ 704.824315] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 704.824315] env[61439]: ERROR nova.compute.manager raise self.value
[ 704.824315] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 704.824315] env[61439]: ERROR nova.compute.manager updated_port = self._update_port(
[ 704.824315] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 704.824315] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 704.826227] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 704.826227] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 704.826227] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 4fb5161b-a3fe-4af0-9a2c-e5502acf4bfd, please check neutron logs for more information.
[ 704.826227] env[61439]: ERROR nova.compute.manager
[ 704.826227] env[61439]: Traceback (most recent call last):
[ 704.826227] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait
[ 704.826227] env[61439]: listener.cb(fileno)
[ 704.826227] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 704.826227] env[61439]: result = function(*args, **kwargs)
[ 704.826227] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 704.826227] env[61439]: return func(*args, **kwargs)
[ 704.826227] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 704.826227] env[61439]: raise e
[ 704.826227] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 704.826227] env[61439]: nwinfo = self.network_api.allocate_for_instance(
[ 704.826227] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 704.826227] env[61439]: created_port_ids = self._update_ports_for_instance(
[ 704.826227] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 704.826227] env[61439]: with excutils.save_and_reraise_exception():
[ 704.826227] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 704.826227] env[61439]: self.force_reraise()
[ 704.826227] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 704.826227] env[61439]: raise self.value
[ 704.826227] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 704.826227] env[61439]: updated_port = self._update_port(
[ 704.826227] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 704.826227] env[61439]: _ensure_no_port_binding_failure(port)
[ 704.826227] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 704.826227] env[61439]: raise exception.PortBindingFailed(port_id=port['id'])
[ 704.828210] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 4fb5161b-a3fe-4af0-9a2c-e5502acf4bfd, please check neutron logs for more information.
[ 704.828210] env[61439]: Removing descriptor: 22
[ 704.828210] env[61439]: ERROR nova.compute.manager [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 4fb5161b-a3fe-4af0-9a2c-e5502acf4bfd, please check neutron logs for more information.
[ 704.828210] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Traceback (most recent call last):
[ 704.828210] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 704.828210] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] yield resources
[ 704.828210] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 704.828210] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] self.driver.spawn(context, instance, image_meta,
[ 704.828210] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 704.828210] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 704.828210] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 704.828210] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] vm_ref = self.build_virtual_machine(instance,
[ 704.828625] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 704.828625] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] vif_infos = vmwarevif.get_vif_info(self._session,
[ 704.828625] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 704.828625] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] for vif in network_info:
[ 704.828625] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 704.828625] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] return self._sync_wrapper(fn, *args, **kwargs)
[ 704.828625] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 704.828625] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] self.wait()
[ 704.828625] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 704.828625] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] self[:] = self._gt.wait()
[ 704.828625] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 704.828625] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] return self._exit_event.wait()
[ 704.828625] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 704.829027] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] result = hub.switch()
[ 704.829027] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 704.829027] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] return self.greenlet.switch()
[ 704.829027] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 704.829027] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] result = function(*args, **kwargs)
[ 704.829027] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 704.829027] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] return func(*args, **kwargs)
[ 704.829027] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 704.829027] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] raise e
[ 704.829027] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 704.829027] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] nwinfo = self.network_api.allocate_for_instance(
[ 704.829027] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 704.829027] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] created_port_ids = self._update_ports_for_instance(
[ 704.829483] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 704.829483] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] with excutils.save_and_reraise_exception():
[ 704.829483] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 704.829483] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] self.force_reraise()
[ 704.829483] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 704.829483] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] raise self.value
[ 704.829483] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 704.829483] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] updated_port = self._update_port(
[ 704.829483] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 704.829483] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] _ensure_no_port_binding_failure(port)
[ 704.829483] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 704.829483] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] raise exception.PortBindingFailed(port_id=port['id'])
[ 704.829921] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] nova.exception.PortBindingFailed: Binding failed for port 4fb5161b-a3fe-4af0-9a2c-e5502acf4bfd, please check neutron logs for more information.
[ 704.829921] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c]
[ 704.829921] env[61439]: INFO nova.compute.manager [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Terminating instance
[ 704.829921] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Acquiring lock "refresh_cache-ad05c801-aff7-4a36-88e9-4994adfa3a8c" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 704.829921] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Acquired lock "refresh_cache-ad05c801-aff7-4a36-88e9-4994adfa3a8c" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 704.829921] env[61439]: DEBUG nova.network.neutron [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 704.877844] env[61439]: DEBUG nova.network.neutron [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 704.905974] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef3edad1-4bae-4dae-8b1b-6eec95aadb3f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 704.915763] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94859037-cf55-476d-9e9e-cc01c4fc0345 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 704.957400] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38273279-0a71-442d-a90e-b4d150f36d20 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 704.968479] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fef3462-fb99-4556-8d31-e47904063633 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 704.986815] env[61439]: DEBUG nova.compute.provider_tree [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 705.003848] env[61439]: DEBUG nova.scheduler.client.report [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 705.024720] env[61439]: DEBUG oslo_concurrency.lockutils [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.328s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 705.025540] env[61439]: DEBUG nova.compute.manager [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 705.098764] env[61439]: DEBUG nova.compute.utils [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 705.100476] env[61439]: DEBUG nova.compute.manager [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 705.100681] env[61439]: DEBUG nova.network.neutron [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 705.115754] env[61439]: DEBUG nova.compute.manager [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 705.332753] env[61439]: DEBUG nova.compute.manager [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Start spawning the instance on the hypervisor.
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 705.332753] env[61439]: DEBUG nova.virt.hardware [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 705.332753] env[61439]: DEBUG nova.virt.hardware [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 705.337521] env[61439]: DEBUG nova.virt.hardware [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 705.337521] env[61439]: DEBUG nova.virt.hardware [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Flavor pref 
0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 705.337521] env[61439]: DEBUG nova.virt.hardware [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 705.337521] env[61439]: DEBUG nova.virt.hardware [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 705.337521] env[61439]: DEBUG nova.virt.hardware [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 705.339061] env[61439]: DEBUG nova.virt.hardware [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 705.339061] env[61439]: DEBUG nova.virt.hardware [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 705.339061] env[61439]: DEBUG nova.virt.hardware [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 
tempest-AttachInterfacesTestJSON-210578950-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 705.339061] env[61439]: DEBUG nova.virt.hardware [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 705.339061] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67e898a4-8568-4ab5-8284-0f08bf0d7725 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 705.340503] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-803750ac-875d-4633-ac45-6a4cb1efa7f6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 705.340503] env[61439]: DEBUG nova.policy [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e64ca57e567146098521cd7356b9e3e2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3e1db803f0ff4f29bb70e0a0d94c57e0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 705.402259] env[61439]: DEBUG nova.network.neutron [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 
tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 705.427831] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Releasing lock "refresh_cache-ad05c801-aff7-4a36-88e9-4994adfa3a8c" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 705.427831] env[61439]: DEBUG nova.compute.manager [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 705.427831] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 705.428148] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e79cde0d-1709-4337-9cf8-8b7a83b5fe5e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 705.441763] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-178583d8-74b4-481e-9294-67b8b91c7bcf {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 705.467493] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 
tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ad05c801-aff7-4a36-88e9-4994adfa3a8c could not be found. [ 705.467787] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 705.468025] env[61439]: INFO nova.compute.manager [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Took 0.04 seconds to destroy the instance on the hypervisor. [ 705.468291] env[61439]: DEBUG oslo.service.loopingcall [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 705.468577] env[61439]: DEBUG nova.compute.manager [-] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 705.468677] env[61439]: DEBUG nova.network.neutron [-] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 705.571855] env[61439]: DEBUG oslo_concurrency.lockutils [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Acquiring lock "de83001c-11a6-4c0d-86c9-9a5e582595bb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 705.572131] env[61439]: DEBUG oslo_concurrency.lockutils [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Lock "de83001c-11a6-4c0d-86c9-9a5e582595bb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 705.573814] env[61439]: DEBUG nova.network.neutron [-] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 705.587052] env[61439]: DEBUG nova.compute.manager [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 705.590398] env[61439]: DEBUG nova.network.neutron [-] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 705.610239] env[61439]: INFO nova.compute.manager [-] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Took 0.14 seconds to deallocate network for instance. [ 705.612251] env[61439]: DEBUG nova.compute.claims [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 705.612251] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 705.612440] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 705.679538] env[61439]: ERROR nova.compute.manager [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port faf5d9cd-dd76-4f82-be96-e1413b39b9e5, please check neutron logs for more information. [ 705.679538] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 705.679538] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 705.679538] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 705.679538] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 705.679538] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 705.679538] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 705.679538] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 705.679538] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 705.679538] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 705.679538] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 705.679538] env[61439]: ERROR nova.compute.manager raise self.value [ 705.679538] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 705.679538] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 705.679538] 
env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 705.679538] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 705.681400] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 705.681400] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 705.681400] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port faf5d9cd-dd76-4f82-be96-e1413b39b9e5, please check neutron logs for more information. [ 705.681400] env[61439]: ERROR nova.compute.manager [ 705.681400] env[61439]: Traceback (most recent call last): [ 705.681400] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 705.681400] env[61439]: listener.cb(fileno) [ 705.681400] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 705.681400] env[61439]: result = function(*args, **kwargs) [ 705.681400] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 705.681400] env[61439]: return func(*args, **kwargs) [ 705.681400] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 705.681400] env[61439]: raise e [ 705.681400] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 705.681400] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 705.681400] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 705.681400] env[61439]: created_port_ids = self._update_ports_for_instance( [ 705.681400] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 705.681400] env[61439]: with excutils.save_and_reraise_exception(): [ 705.681400] 
env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 705.681400] env[61439]: self.force_reraise() [ 705.681400] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 705.681400] env[61439]: raise self.value [ 705.681400] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 705.681400] env[61439]: updated_port = self._update_port( [ 705.681400] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 705.681400] env[61439]: _ensure_no_port_binding_failure(port) [ 705.681400] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 705.681400] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 705.682319] env[61439]: nova.exception.PortBindingFailed: Binding failed for port faf5d9cd-dd76-4f82-be96-e1413b39b9e5, please check neutron logs for more information. [ 705.682319] env[61439]: Removing descriptor: 23 [ 705.682319] env[61439]: ERROR nova.compute.manager [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port faf5d9cd-dd76-4f82-be96-e1413b39b9e5, please check neutron logs for more information. 
[ 705.682319] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Traceback (most recent call last): [ 705.682319] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 705.682319] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] yield resources [ 705.682319] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 705.682319] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] self.driver.spawn(context, instance, image_meta, [ 705.682319] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 705.682319] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 705.682319] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 705.682319] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] vm_ref = self.build_virtual_machine(instance, [ 705.682704] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 705.682704] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] vif_infos = vmwarevif.get_vif_info(self._session, [ 705.682704] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 705.682704] env[61439]: ERROR 
nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] for vif in network_info: [ 705.682704] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 705.682704] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] return self._sync_wrapper(fn, *args, **kwargs) [ 705.682704] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 705.682704] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] self.wait() [ 705.682704] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 705.682704] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] self[:] = self._gt.wait() [ 705.682704] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 705.682704] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] return self._exit_event.wait() [ 705.682704] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 705.683114] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] result = hub.switch() [ 705.683114] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 705.683114] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] return self.greenlet.switch() [ 705.683114] 
env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 705.683114] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] result = function(*args, **kwargs) [ 705.683114] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 705.683114] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] return func(*args, **kwargs) [ 705.683114] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 705.683114] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] raise e [ 705.683114] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 705.683114] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] nwinfo = self.network_api.allocate_for_instance( [ 705.683114] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 705.683114] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] created_port_ids = self._update_ports_for_instance( [ 705.683531] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 705.683531] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] with excutils.save_and_reraise_exception(): [ 705.683531] env[61439]: ERROR nova.compute.manager 
[instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 705.683531] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] self.force_reraise() [ 705.683531] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 705.683531] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] raise self.value [ 705.683531] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 705.683531] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] updated_port = self._update_port( [ 705.683531] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 705.683531] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] _ensure_no_port_binding_failure(port) [ 705.683531] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 705.683531] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] raise exception.PortBindingFailed(port_id=port['id']) [ 705.686847] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] nova.exception.PortBindingFailed: Binding failed for port faf5d9cd-dd76-4f82-be96-e1413b39b9e5, please check neutron logs for more information. 
[ 705.686847] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] [ 705.686847] env[61439]: INFO nova.compute.manager [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Terminating instance [ 705.686847] env[61439]: DEBUG oslo_concurrency.lockutils [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Acquiring lock "refresh_cache-f347cbeb-a096-40db-8528-6cee24d390c2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 705.686847] env[61439]: DEBUG oslo_concurrency.lockutils [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Acquired lock "refresh_cache-f347cbeb-a096-40db-8528-6cee24d390c2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 705.686847] env[61439]: DEBUG nova.network.neutron [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 705.689773] env[61439]: DEBUG oslo_concurrency.lockutils [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 705.776947] env[61439]: DEBUG 
nova.network.neutron [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 705.815109] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b64bab67-79cf-4964-b9eb-c07fdfacd303 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 705.824349] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f01dc363-2674-4378-ae7b-66fceaa27f15 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 705.858831] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86595194-211f-42fb-bd7c-9fd32659faca {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 705.866695] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66509915-024a-46b9-81a4-bcc7394345a6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 705.880860] env[61439]: DEBUG nova.compute.provider_tree [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 705.893991] env[61439]: DEBUG nova.scheduler.client.report [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] 
Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 705.913504] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.301s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 705.914164] env[61439]: ERROR nova.compute.manager [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 4fb5161b-a3fe-4af0-9a2c-e5502acf4bfd, please check neutron logs for more information. 
[ 705.914164] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Traceback (most recent call last): [ 705.914164] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 705.914164] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] self.driver.spawn(context, instance, image_meta, [ 705.914164] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 705.914164] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 705.914164] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 705.914164] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] vm_ref = self.build_virtual_machine(instance, [ 705.914164] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 705.914164] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] vif_infos = vmwarevif.get_vif_info(self._session, [ 705.914164] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 705.914702] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] for vif in network_info: [ 705.914702] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 705.914702] env[61439]: ERROR 
nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] return self._sync_wrapper(fn, *args, **kwargs) [ 705.914702] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 705.914702] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] self.wait() [ 705.914702] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 705.914702] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] self[:] = self._gt.wait() [ 705.914702] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 705.914702] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] return self._exit_event.wait() [ 705.914702] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 705.914702] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] result = hub.switch() [ 705.914702] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 705.914702] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] return self.greenlet.switch() [ 705.915421] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 705.915421] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] result 
= function(*args, **kwargs) [ 705.915421] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 705.915421] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] return func(*args, **kwargs) [ 705.915421] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 705.915421] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] raise e [ 705.915421] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 705.915421] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] nwinfo = self.network_api.allocate_for_instance( [ 705.915421] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 705.915421] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] created_port_ids = self._update_ports_for_instance( [ 705.915421] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 705.915421] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] with excutils.save_and_reraise_exception(): [ 705.915421] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 705.915792] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] self.force_reraise() [ 705.915792] env[61439]: 
ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 705.915792] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] raise self.value [ 705.915792] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 705.915792] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] updated_port = self._update_port( [ 705.915792] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 705.915792] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] _ensure_no_port_binding_failure(port) [ 705.915792] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 705.915792] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] raise exception.PortBindingFailed(port_id=port['id']) [ 705.915792] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] nova.exception.PortBindingFailed: Binding failed for port 4fb5161b-a3fe-4af0-9a2c-e5502acf4bfd, please check neutron logs for more information. [ 705.915792] env[61439]: ERROR nova.compute.manager [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] [ 705.916109] env[61439]: DEBUG nova.compute.utils [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Binding failed for port 4fb5161b-a3fe-4af0-9a2c-e5502acf4bfd, please check neutron logs for more information. 
{{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 705.916745] env[61439]: DEBUG oslo_concurrency.lockutils [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.226s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 705.920592] env[61439]: INFO nova.compute.claims [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 705.921485] env[61439]: DEBUG nova.compute.manager [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Build of instance ad05c801-aff7-4a36-88e9-4994adfa3a8c was re-scheduled: Binding failed for port 4fb5161b-a3fe-4af0-9a2c-e5502acf4bfd, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 705.922295] env[61439]: DEBUG nova.compute.manager [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 705.922564] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Acquiring lock "refresh_cache-ad05c801-aff7-4a36-88e9-4994adfa3a8c" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 705.922803] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Acquired lock "refresh_cache-ad05c801-aff7-4a36-88e9-4994adfa3a8c" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 705.923046] env[61439]: DEBUG nova.network.neutron [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 705.999710] env[61439]: DEBUG nova.network.neutron [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 706.129019] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbd6aa0c-a936-4ad1-b060-3ffdcb96494e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 706.139319] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc80ea50-3f21-4463-9f81-4cfb3af2f860 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 706.176751] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa3e9c64-0258-4306-8494-10e2c8458ff4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 706.188057] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6127236-6a2c-4cff-b533-8e789b081726 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 706.208831] env[61439]: DEBUG nova.compute.provider_tree [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 706.222469] env[61439]: DEBUG nova.scheduler.client.report [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 
'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 706.244492] env[61439]: DEBUG oslo_concurrency.lockutils [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.327s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 706.245560] env[61439]: DEBUG nova.compute.manager [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 706.308492] env[61439]: DEBUG nova.compute.utils [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 706.311244] env[61439]: DEBUG nova.compute.manager [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 706.311880] env[61439]: DEBUG nova.network.neutron [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 706.333687] env[61439]: DEBUG nova.compute.manager [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 706.440019] env[61439]: DEBUG nova.compute.manager [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 706.470648] env[61439]: DEBUG nova.virt.hardware [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 706.471097] env[61439]: DEBUG nova.virt.hardware [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 706.471862] env[61439]: DEBUG nova.virt.hardware [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 706.472348] env[61439]: DEBUG nova.virt.hardware [None req-939e2f71-1616-4b3f-beed-2129a71943a7 
tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 706.472668] env[61439]: DEBUG nova.virt.hardware [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 706.472733] env[61439]: DEBUG nova.virt.hardware [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 706.473221] env[61439]: DEBUG nova.virt.hardware [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 706.473507] env[61439]: DEBUG nova.virt.hardware [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 706.473735] env[61439]: DEBUG nova.virt.hardware [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 
tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 706.473947] env[61439]: DEBUG nova.virt.hardware [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 706.474382] env[61439]: DEBUG nova.virt.hardware [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 706.476333] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79c6d51c-d6b7-4492-a09a-4ec666e8ef54 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 706.488269] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc851b32-d01f-4d45-92ce-52209f968828 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 706.512155] env[61439]: DEBUG nova.network.neutron [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 706.527711] env[61439]: DEBUG oslo_concurrency.lockutils [None 
req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Releasing lock "refresh_cache-f347cbeb-a096-40db-8528-6cee24d390c2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 706.528206] env[61439]: DEBUG nova.compute.manager [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 706.529072] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 706.529072] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5281361a-875f-4059-934f-6e604ea91b39 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 706.541264] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71ee4c07-8510-4360-acf5-491467251f73 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 706.571642] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f347cbeb-a096-40db-8528-6cee24d390c2 could not be found. 
[ 706.571642] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 706.571642] env[61439]: INFO nova.compute.manager [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Took 0.04 seconds to destroy the instance on the hypervisor. [ 706.571898] env[61439]: DEBUG oslo.service.loopingcall [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 706.572781] env[61439]: DEBUG nova.compute.manager [-] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 706.572781] env[61439]: DEBUG nova.network.neutron [-] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 706.654193] env[61439]: DEBUG nova.network.neutron [-] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 706.669663] env[61439]: DEBUG nova.policy [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b94cc1a2817f4ffcbabd083bca41bfc2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e655d07206d94e0b8c65207d43edcdf4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 706.672976] env[61439]: DEBUG nova.network.neutron [-] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 706.686708] env[61439]: INFO nova.compute.manager [-] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Took 0.11 seconds to deallocate network for instance. 
[ 706.691861] env[61439]: DEBUG nova.compute.claims [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 706.692694] env[61439]: DEBUG oslo_concurrency.lockutils [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 706.692694] env[61439]: DEBUG oslo_concurrency.lockutils [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 706.704018] env[61439]: DEBUG nova.network.neutron [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 706.726119] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Releasing lock "refresh_cache-ad05c801-aff7-4a36-88e9-4994adfa3a8c" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 706.726356] env[61439]: DEBUG nova.compute.manager [None 
req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 706.726536] env[61439]: DEBUG nova.compute.manager [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 706.726699] env[61439]: DEBUG nova.network.neutron [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 706.808968] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Acquiring lock "ec1394bf-266b-4830-8996-b6221c47c2e1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 706.809380] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Lock "ec1394bf-266b-4830-8996-b6221c47c2e1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 706.818558] 
env[61439]: DEBUG nova.network.neutron [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 706.828808] env[61439]: DEBUG nova.compute.manager [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 706.833768] env[61439]: DEBUG nova.network.neutron [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 706.841691] env[61439]: INFO nova.compute.manager [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] [instance: ad05c801-aff7-4a36-88e9-4994adfa3a8c] Took 0.11 seconds to deallocate network for instance. 
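The oslo_concurrency.lockutils DEBUG lines throughout this log follow a fixed shape: acquire the named lock, report how long the caller waited, run the critical section, then report how long the lock was held on release. A minimal stdlib sketch of that pattern (a simplified stand-in for illustration, not the actual oslo.concurrency implementation):

```python
import threading
import time
from contextlib import contextmanager

@contextmanager
def timed_lock(lock, name):
    """Acquire *lock*, reporting wait and held times in the style of
    the lockutils DEBUG lines above (simplified stand-in only)."""
    start = time.monotonic()
    with lock:
        waited = time.monotonic() - start
        print(f'Lock "{name}" acquired :: waited {waited:.3f}s')
        held_start = time.monotonic()
        try:
            yield
        finally:
            held = time.monotonic() - held_start
            print(f'Lock "{name}" released :: held {held:.3f}s')

# Usage mirroring the "compute_resources" lock in the log:
lock = threading.Lock()
with timed_lock(lock, "compute_resources"):
    pass  # critical section, e.g. an instance claim
```

The wait/held split is what makes these lines useful when reading the log: a long "waited" indicates contention on the lock, while a long "held" (e.g. the 0.449s hold for abort_instance_claim later in this log) points at slow work inside the critical section.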
[ 706.952026] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 706.998101] env[61439]: INFO nova.scheduler.client.report [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Deleted allocations for instance ad05c801-aff7-4a36-88e9-4994adfa3a8c [ 707.026145] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67e45458-d601-4483-b857-adae56b8ac2c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.036803] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f13ad252-183a-488f-96a2-7e2ddb5d5a4c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.043395] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ac788df8-e141-4402-aa21-8f517cf257a9 tempest-ServersTestJSON-1395993600 tempest-ServersTestJSON-1395993600-project-member] Lock "ad05c801-aff7-4a36-88e9-4994adfa3a8c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 11.165s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 707.080680] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9595fe8-6e9c-46fb-8c82-2483da328b8c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.090156] env[61439]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b83752f6-07a4-430d-9b10-447bb4331ade {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.106761] env[61439]: DEBUG nova.compute.provider_tree [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 707.119902] env[61439]: DEBUG nova.scheduler.client.report [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 707.142041] env[61439]: DEBUG oslo_concurrency.lockutils [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.449s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 707.142584] env[61439]: ERROR nova.compute.manager [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 
tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port faf5d9cd-dd76-4f82-be96-e1413b39b9e5, please check neutron logs for more information. [ 707.142584] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Traceback (most recent call last): [ 707.142584] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 707.142584] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] self.driver.spawn(context, instance, image_meta, [ 707.142584] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 707.142584] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 707.142584] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 707.142584] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] vm_ref = self.build_virtual_machine(instance, [ 707.142584] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 707.142584] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] vif_infos = vmwarevif.get_vif_info(self._session, [ 707.142584] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 707.142983] env[61439]: ERROR 
nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] for vif in network_info: [ 707.142983] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 707.142983] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] return self._sync_wrapper(fn, *args, **kwargs) [ 707.142983] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 707.142983] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] self.wait() [ 707.142983] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 707.142983] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] self[:] = self._gt.wait() [ 707.142983] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 707.142983] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] return self._exit_event.wait() [ 707.142983] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 707.142983] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] result = hub.switch() [ 707.142983] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 707.142983] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] return self.greenlet.switch() [ 707.143839] 
env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 707.143839] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] result = function(*args, **kwargs) [ 707.143839] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 707.143839] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] return func(*args, **kwargs) [ 707.143839] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 707.143839] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] raise e [ 707.143839] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 707.143839] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] nwinfo = self.network_api.allocate_for_instance( [ 707.143839] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 707.143839] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] created_port_ids = self._update_ports_for_instance( [ 707.143839] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 707.143839] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] with excutils.save_and_reraise_exception(): [ 707.143839] env[61439]: ERROR nova.compute.manager 
[instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 707.144443] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] self.force_reraise() [ 707.144443] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 707.144443] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] raise self.value [ 707.144443] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 707.144443] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] updated_port = self._update_port( [ 707.144443] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 707.144443] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] _ensure_no_port_binding_failure(port) [ 707.144443] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 707.144443] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] raise exception.PortBindingFailed(port_id=port['id']) [ 707.144443] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] nova.exception.PortBindingFailed: Binding failed for port faf5d9cd-dd76-4f82-be96-e1413b39b9e5, please check neutron logs for more information. 
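The bottom frame of the traceback above, `_ensure_no_port_binding_failure`, is where a Neutron port whose binding failed becomes the `PortBindingFailed` exception reported at the top. A minimal sketch of that check (the `binding:vif_type` attribute and the `binding_failed` sentinel are Neutron's port-binding extension; the exception class here is a local stand-in for `nova.exception.PortBindingFailed`):

```python
class PortBindingFailed(Exception):
    """Local stand-in for nova.exception.PortBindingFailed."""
    def __init__(self, port_id):
        super().__init__(
            f"Binding failed for port {port_id}, please check neutron "
            f"logs for more information.")
        self.port_id = port_id

def ensure_no_port_binding_failure(port):
    """Raise if Neutron reports the port's binding as failed.

    Neutron sets 'binding:vif_type' to 'binding_failed' when no
    mechanism driver could bind the port to the host.
    """
    if port.get('binding:vif_type') == 'binding_failed':
        raise PortBindingFailed(port_id=port['id'])

# A successfully bound port passes silently:
ensure_no_port_binding_failure(
    {'id': 'example-port', 'binding:vif_type': 'ovs'})
```

Because the binding check runs inside the async network-allocation greenthread, the failure only surfaces when the spawn path iterates `network_info` and the sync wrapper re-raises it, which is why the traceback walks through eventlet before reaching the Neutron code.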
[ 707.144443] env[61439]: ERROR nova.compute.manager [instance: f347cbeb-a096-40db-8528-6cee24d390c2] [ 707.144962] env[61439]: DEBUG nova.compute.utils [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Binding failed for port faf5d9cd-dd76-4f82-be96-e1413b39b9e5, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 707.144962] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.193s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 707.146439] env[61439]: INFO nova.compute.claims [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 707.150168] env[61439]: DEBUG nova.compute.manager [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Build of instance f347cbeb-a096-40db-8528-6cee24d390c2 was re-scheduled: Binding failed for port faf5d9cd-dd76-4f82-be96-e1413b39b9e5, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 707.150168] env[61439]: DEBUG nova.compute.manager [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 707.150168] env[61439]: DEBUG oslo_concurrency.lockutils [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Acquiring lock "refresh_cache-f347cbeb-a096-40db-8528-6cee24d390c2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 707.150399] env[61439]: DEBUG oslo_concurrency.lockutils [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Acquired lock "refresh_cache-f347cbeb-a096-40db-8528-6cee24d390c2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 707.150481] env[61439]: DEBUG nova.network.neutron [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 707.201189] env[61439]: DEBUG nova.network.neutron [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Successfully created port: 2fcf6fc6-cc25-4947-83e2-8491785cb7e2 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 
707.227040] env[61439]: DEBUG nova.network.neutron [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 707.383494] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e48d3fe-549f-459e-b0f6-f1fd7de256c5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.389834] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d192477-fc77-4eea-840d-cf6b4ddb4332 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.431771] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27466aec-61a2-4643-a9c4-5cc5d2f96b5d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.439336] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-caefd01c-0006-4d2c-9f57-90af50f8a09f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.454127] env[61439]: DEBUG nova.compute.provider_tree [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 707.470330] env[61439]: DEBUG nova.scheduler.client.report [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 
tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 707.501426] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.356s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 707.501724] env[61439]: DEBUG nova.compute.manager [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Start building networks asynchronously for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 707.567104] env[61439]: DEBUG nova.compute.utils [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 707.568617] env[61439]: DEBUG nova.compute.manager [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 707.568824] env[61439]: DEBUG nova.network.neutron [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 707.582020] env[61439]: DEBUG nova.compute.manager [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 707.659332] env[61439]: DEBUG nova.compute.manager [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 707.687076] env[61439]: DEBUG nova.virt.hardware [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 707.687337] env[61439]: DEBUG nova.virt.hardware [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 707.687500] env[61439]: DEBUG nova.virt.hardware [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 707.687687] env[61439]: DEBUG nova.virt.hardware [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Flavor 
pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 707.687836] env[61439]: DEBUG nova.virt.hardware [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 707.688069] env[61439]: DEBUG nova.virt.hardware [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 707.688301] env[61439]: DEBUG nova.virt.hardware [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 707.688472] env[61439]: DEBUG nova.virt.hardware [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 707.688640] env[61439]: DEBUG nova.virt.hardware [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 707.688804] env[61439]: DEBUG nova.virt.hardware [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 
tempest-AttachInterfacesV270Test-1607570909-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 707.688971] env[61439]: DEBUG nova.virt.hardware [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 707.690132] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-184e2713-f1c5-4a7d-9a3f-29045c77a6f6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.699402] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6273fe98-3baf-4d35-bbd8-b7228f88b746 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.919332] env[61439]: DEBUG nova.policy [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f2681b9908124b12a9b39c688cf31f2a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cd42a914c12b4f0c952fb8b911ea0e8a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 708.063137] env[61439]: DEBUG nova.network.neutron [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 
tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 708.077955] env[61439]: DEBUG oslo_concurrency.lockutils [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Releasing lock "refresh_cache-f347cbeb-a096-40db-8528-6cee24d390c2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 708.078255] env[61439]: DEBUG nova.compute.manager [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 708.078440] env[61439]: DEBUG nova.compute.manager [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 708.078603] env[61439]: DEBUG nova.network.neutron [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 708.150760] env[61439]: DEBUG nova.network.neutron [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 
tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 708.159888] env[61439]: DEBUG nova.network.neutron [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 708.176471] env[61439]: INFO nova.compute.manager [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: f347cbeb-a096-40db-8528-6cee24d390c2] Took 0.10 seconds to deallocate network for instance. [ 708.290884] env[61439]: INFO nova.scheduler.client.report [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Deleted allocations for instance f347cbeb-a096-40db-8528-6cee24d390c2 [ 708.314494] env[61439]: DEBUG oslo_concurrency.lockutils [None req-32ffa299-2ab2-41ba-b128-78b5ab0cda25 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Lock "f347cbeb-a096-40db-8528-6cee24d390c2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.486s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 708.586440] env[61439]: DEBUG nova.network.neutron [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Successfully 
created port: 6c845135-77bc-40b8-ab5d-e008de23383a {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 709.088288] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Acquiring lock "3a0fe362-53a6-48a4-9fdd-cc81586bacfc" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 709.088555] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Lock "3a0fe362-53a6-48a4-9fdd-cc81586bacfc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 709.103447] env[61439]: DEBUG nova.compute.manager [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 709.163453] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 709.163898] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 709.165336] env[61439]: INFO nova.compute.claims [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 709.350179] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d290670f-0946-4e17-bb3a-266c650f14d8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 709.359426] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0436b27-d802-4ddd-adf0-2f828900c4d9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 709.401629] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bd966d0-6177-46be-ae2c-e190c3551481 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 709.411820] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-472b61fb-89cc-4237-8daf-51498c717919 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 709.425835] env[61439]: DEBUG nova.compute.provider_tree [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 709.438153] env[61439]: DEBUG nova.scheduler.client.report [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 709.456802] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.292s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 709.456802] env[61439]: DEBUG nova.compute.manager [None 
req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 709.521805] env[61439]: DEBUG nova.compute.utils [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 709.526038] env[61439]: DEBUG nova.compute.manager [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 709.526038] env[61439]: DEBUG nova.network.neutron [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 709.543666] env[61439]: DEBUG nova.compute.manager [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Start building block device mappings for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 709.664674] env[61439]: DEBUG nova.compute.manager [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 709.690540] env[61439]: DEBUG nova.network.neutron [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Successfully created port: 17a69317-ee7c-4a40-a06a-39537a1c4a67 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 709.697981] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 709.714035] env[61439]: DEBUG nova.virt.hardware [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 709.714035] env[61439]: DEBUG nova.virt.hardware [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 709.714035] env[61439]: DEBUG nova.virt.hardware [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 709.716664] env[61439]: DEBUG nova.virt.hardware [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 709.716838] env[61439]: DEBUG nova.virt.hardware [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 709.716989] env[61439]: DEBUG nova.virt.hardware [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Chose sockets=0, cores=0, threads=0; limits 
were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 709.717405] env[61439]: DEBUG nova.virt.hardware [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 709.717600] env[61439]: DEBUG nova.virt.hardware [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 709.717772] env[61439]: DEBUG nova.virt.hardware [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 709.717930] env[61439]: DEBUG nova.virt.hardware [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 709.718169] env[61439]: DEBUG nova.virt.hardware [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 709.718984] env[61439]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1743433e-d1a0-43d2-a562-ad639db14049 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 709.729139] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f51dd03-e080-4c68-a2d9-b3b679991b62 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 709.738364] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 709.738544] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Starting heal instance info cache {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 709.738731] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Rebuilding the list of instances to heal {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 709.759984] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 709.759984] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Skipping network cache update for instance because it is Building. 
{{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 709.759984] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 4527b287-d099-443c-a424-185d02054be0] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 709.760397] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 709.760633] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 709.760795] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 709.760949] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 709.761118] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Didn't find any instances for network info cache update. 
{{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 709.761590] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 709.761794] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 709.761984] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager.update_available_resource {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 709.772169] env[61439]: DEBUG nova.policy [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd49d4ec89a0b466e8e292bc9bffc440f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'be173f6f52a14206b92e812f72c87d9b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 709.778045] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=61439) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 709.778283] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 709.778453] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 709.778608] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=61439) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 709.779676] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbb1aa45-4c17-40b9-9d6a-87bde8bd16a1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 709.790284] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a01fdc3-3b72-4e5d-ac8d-67eed72cf936 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 709.806752] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1812cc1-2471-4645-b7ea-b7f85aff9a92 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 709.814296] env[61439]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be768362-b98c-457d-9a8a-17b54f2d6be3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 709.847963] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181576MB free_disk=35GB free_vcpus=48 pci_devices=None {{(pid=61439) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 709.848145] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 709.848350] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 709.930578] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance a41cb33f-8340-4b15-b19d-0a7b9396eae7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 709.930761] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 8fe2bccd-5b46-4067-b72b-bdbf726c0155 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 709.930959] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 4527b287-d099-443c-a424-185d02054be0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 709.931107] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 709.931192] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance de83001c-11a6-4c0d-86c9-9a5e582595bb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 709.931367] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance ec1394bf-266b-4830-8996-b6221c47c2e1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 709.931443] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 3a0fe362-53a6-48a4-9fdd-cc81586bacfc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 709.931622] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 709.931763] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 710.068074] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e5d92db-07fc-45f5-a4c5-105335775fb0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 710.077039] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b779ccc3-5898-4b0d-8a2b-45c442ed62b7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 710.112108] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79ad59e8-0458-462b-be29-2c2bf05a094e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
710.120133] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86850499-703c-4335-83f5-5f4ab1c6a3d6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 710.134586] env[61439]: DEBUG nova.compute.provider_tree [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 710.147496] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 710.167728] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=61439) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 710.168112] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.320s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 710.608444] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None 
None] Running periodic task ComputeManager._check_instance_build_time {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 710.608589] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 710.608882] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 710.608882] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=61439) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 711.201770] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 711.201993] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 711.978815] env[61439]: DEBUG nova.network.neutron [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Successfully created port: 547f811a-085a-4dec-b008-82e96ca7b6be {{(pid=61439) _create_port_minimal 
/opt/stack/nova/nova/network/neutron.py:548}} [ 714.155135] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Acquiring lock "3495611c-05aa-4c1c-89fc-16ecb1fcb6bb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 714.155492] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Lock "3495611c-05aa-4c1c-89fc-16ecb1fcb6bb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 714.167753] env[61439]: DEBUG nova.compute.manager [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 714.231016] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 714.231016] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 714.231458] env[61439]: INFO nova.compute.claims [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 714.437087] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f565715-8cf6-40f9-a71e-6fc974de84ec {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 714.448612] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8301a11-963d-4168-a2de-30c405b7600f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 714.494249] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98810027-3906-4af1-816b-3ad82a615b87 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 714.505089] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b27bd714-560b-42ed-bff4-8ae8cbab1fe3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 714.520670] env[61439]: DEBUG nova.compute.provider_tree [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 714.533798] env[61439]: DEBUG nova.scheduler.client.report [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 714.558788] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.329s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 714.559474] env[61439]: DEBUG nova.compute.manager [None 
req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 714.612944] env[61439]: DEBUG nova.compute.utils [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 714.615030] env[61439]: DEBUG nova.compute.manager [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 714.615171] env[61439]: DEBUG nova.network.neutron [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 714.628166] env[61439]: DEBUG nova.compute.manager [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Start building block device mappings for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 714.733573] env[61439]: DEBUG nova.compute.manager [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 714.771866] env[61439]: DEBUG nova.virt.hardware [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 714.773554] env[61439]: DEBUG nova.virt.hardware [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 714.773554] env[61439]: DEBUG nova.virt.hardware [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 
tempest-ServersTestManualDisk-714945568-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 714.773784] env[61439]: DEBUG nova.virt.hardware [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 714.773784] env[61439]: DEBUG nova.virt.hardware [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 714.773901] env[61439]: DEBUG nova.virt.hardware [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 714.774092] env[61439]: DEBUG nova.virt.hardware [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 714.774323] env[61439]: DEBUG nova.virt.hardware [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 714.778311] env[61439]: DEBUG nova.virt.hardware [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 
tempest-ServersTestManualDisk-714945568-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 714.778509] env[61439]: DEBUG nova.virt.hardware [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 714.778690] env[61439]: DEBUG nova.virt.hardware [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 714.779622] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1459c99b-df45-4106-a8e1-7c0eebef6ecd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 714.788865] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11168f0f-da5e-4181-b12a-501150e8a66d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 714.813096] env[61439]: ERROR nova.compute.manager [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 2fcf6fc6-cc25-4947-83e2-8491785cb7e2, please check neutron logs for more information. 
[ 714.813096] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 714.813096] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 714.813096] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 714.813096] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 714.813096] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 714.813096] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 714.813096] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 714.813096] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 714.813096] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 714.813096] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 714.813096] env[61439]: ERROR nova.compute.manager raise self.value [ 714.813096] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 714.813096] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 714.813096] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 714.813096] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 714.813795] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 714.813795] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 714.813795] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 2fcf6fc6-cc25-4947-83e2-8491785cb7e2, please check neutron logs for more information. [ 714.813795] env[61439]: ERROR nova.compute.manager [ 714.813795] env[61439]: Traceback (most recent call last): [ 714.813795] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 714.813795] env[61439]: listener.cb(fileno) [ 714.813795] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 714.813795] env[61439]: result = function(*args, **kwargs) [ 714.813795] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 714.813795] env[61439]: return func(*args, **kwargs) [ 714.813795] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 714.813795] env[61439]: raise e [ 714.813795] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 714.813795] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 714.813795] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 714.813795] env[61439]: created_port_ids = self._update_ports_for_instance( [ 714.813795] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 714.813795] env[61439]: with excutils.save_and_reraise_exception(): [ 714.813795] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 714.813795] env[61439]: self.force_reraise() [ 714.813795] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 714.813795] env[61439]: raise self.value [ 714.813795] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 714.813795] env[61439]: 
updated_port = self._update_port( [ 714.813795] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 714.813795] env[61439]: _ensure_no_port_binding_failure(port) [ 714.813795] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 714.813795] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 714.814776] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 2fcf6fc6-cc25-4947-83e2-8491785cb7e2, please check neutron logs for more information. [ 714.814776] env[61439]: Removing descriptor: 20 [ 714.814776] env[61439]: ERROR nova.compute.manager [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 2fcf6fc6-cc25-4947-83e2-8491785cb7e2, please check neutron logs for more information. 
[ 714.814776] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Traceback (most recent call last): [ 714.814776] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 714.814776] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] yield resources [ 714.814776] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 714.814776] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] self.driver.spawn(context, instance, image_meta, [ 714.814776] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 714.814776] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] self._vmops.spawn(context, instance, image_meta, injected_files, [ 714.814776] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 714.814776] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] vm_ref = self.build_virtual_machine(instance, [ 714.815213] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 714.815213] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] vif_infos = vmwarevif.get_vif_info(self._session, [ 714.815213] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 714.815213] env[61439]: ERROR 
nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] for vif in network_info: [ 714.815213] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 714.815213] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] return self._sync_wrapper(fn, *args, **kwargs) [ 714.815213] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 714.815213] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] self.wait() [ 714.815213] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 714.815213] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] self[:] = self._gt.wait() [ 714.815213] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 714.815213] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] return self._exit_event.wait() [ 714.815213] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 714.815647] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] result = hub.switch() [ 714.815647] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 714.815647] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] return self.greenlet.switch() [ 714.815647] 
env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 714.815647] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] result = function(*args, **kwargs) [ 714.815647] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 714.815647] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] return func(*args, **kwargs) [ 714.815647] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 714.815647] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] raise e [ 714.815647] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 714.815647] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] nwinfo = self.network_api.allocate_for_instance( [ 714.815647] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 714.815647] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] created_port_ids = self._update_ports_for_instance( [ 714.816089] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 714.816089] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] with excutils.save_and_reraise_exception(): [ 714.816089] env[61439]: ERROR nova.compute.manager 
[instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 714.816089] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] self.force_reraise() [ 714.816089] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 714.816089] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] raise self.value [ 714.816089] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 714.816089] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] updated_port = self._update_port( [ 714.816089] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 714.816089] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] _ensure_no_port_binding_failure(port) [ 714.816089] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 714.816089] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] raise exception.PortBindingFailed(port_id=port['id']) [ 714.816552] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] nova.exception.PortBindingFailed: Binding failed for port 2fcf6fc6-cc25-4947-83e2-8491785cb7e2, please check neutron logs for more information. 
[ 714.816552] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] [ 714.816552] env[61439]: INFO nova.compute.manager [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Terminating instance [ 714.816552] env[61439]: DEBUG oslo_concurrency.lockutils [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Acquiring lock "refresh_cache-ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 714.816701] env[61439]: DEBUG oslo_concurrency.lockutils [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Acquired lock "refresh_cache-ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 714.816801] env[61439]: DEBUG nova.network.neutron [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 714.847258] env[61439]: DEBUG nova.policy [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '495e4a012f524a31a7edbfc5c231eb5f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6e0140698aef43e4a744512d99c27d87', 'project_domain_id': 'default', 'roles': 
['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 714.924817] env[61439]: DEBUG nova.network.neutron [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 714.935691] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquiring lock "c9aebd15-4afa-47b4-9072-40ea371c2857" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 714.935691] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "c9aebd15-4afa-47b4-9072-40ea371c2857" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 714.950209] env[61439]: DEBUG nova.compute.manager [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 715.018429] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 715.018723] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 715.020218] env[61439]: INFO nova.compute.claims [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 715.211383] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1208a0ed-40a3-4384-bdd5-4cd1e39e50eb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 715.219886] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fc8418b-2ff7-459a-9f09-5ac9b98a4562 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 715.256277] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59fc9926-0e7a-4619-a270-fd2380ae1e07 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 715.264588] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ace7eb2-079b-49c0-bb44-71a1f857440e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 715.279657] env[61439]: DEBUG nova.compute.provider_tree [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 715.288604] env[61439]: DEBUG nova.scheduler.client.report [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 715.304312] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.285s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 715.304844] env[61439]: DEBUG nova.compute.manager [None 
req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 715.343529] env[61439]: DEBUG nova.compute.utils [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 715.348020] env[61439]: DEBUG nova.compute.manager [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 715.348020] env[61439]: DEBUG nova.network.neutron [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 715.358728] env[61439]: DEBUG nova.compute.manager [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Start building block device mappings for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 715.450943] env[61439]: DEBUG nova.compute.manager [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 715.483345] env[61439]: DEBUG nova.virt.hardware [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 715.483708] env[61439]: DEBUG nova.virt.hardware [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 715.483795] env[61439]: DEBUG nova.virt.hardware [None req-8a009510-42db-4540-8479-8a6ed2a40dbd 
tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 715.483905] env[61439]: DEBUG nova.virt.hardware [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 715.484871] env[61439]: DEBUG nova.virt.hardware [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 715.485258] env[61439]: DEBUG nova.virt.hardware [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 715.485634] env[61439]: DEBUG nova.virt.hardware [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 715.485911] env[61439]: DEBUG nova.virt.hardware [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 715.486126] env[61439]: DEBUG nova.virt.hardware 
[None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 715.486306] env[61439]: DEBUG nova.virt.hardware [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 715.486484] env[61439]: DEBUG nova.virt.hardware [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 715.489176] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fcedd58-0457-45d7-bd51-0a2a1f807995 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 715.497659] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5c2c903-a06a-4f9f-ad77-1baf83df5a8e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 715.754780] env[61439]: DEBUG nova.network.neutron [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 715.777877] env[61439]: DEBUG oslo_concurrency.lockutils [None 
req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Releasing lock "refresh_cache-ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 715.777877] env[61439]: DEBUG nova.compute.manager [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 715.777877] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 715.777877] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-cf0c28d9-dd1d-484d-a2ac-78b2df245789 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 715.790308] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dabfb3ce-4353-4550-9c8a-f4f78583292a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 715.804457] env[61439]: DEBUG nova.policy [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b861ada4972f4431b0b9bd46ae21f7cc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 
'16074166244d449b99488fc24f4f3d74', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 715.820820] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197 could not be found. [ 715.821345] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 715.821950] env[61439]: INFO nova.compute.manager [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Took 0.04 seconds to destroy the instance on the hypervisor. [ 715.823023] env[61439]: DEBUG oslo.service.loopingcall [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 715.823023] env[61439]: DEBUG nova.compute.manager [-] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 715.823023] env[61439]: DEBUG nova.network.neutron [-] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 715.922043] env[61439]: DEBUG nova.network.neutron [-] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 715.931977] env[61439]: DEBUG nova.network.neutron [-] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 715.942444] env[61439]: INFO nova.compute.manager [-] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Took 0.12 seconds to deallocate network for instance. 
[ 715.945019] env[61439]: DEBUG nova.compute.claims [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 715.945539] env[61439]: DEBUG oslo_concurrency.lockutils [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 715.945539] env[61439]: DEBUG oslo_concurrency.lockutils [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 716.167255] env[61439]: ERROR nova.compute.manager [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 6c845135-77bc-40b8-ab5d-e008de23383a, please check neutron logs for more information. 
[ 716.167255] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 716.167255] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 716.167255] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 716.167255] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 716.167255] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 716.167255] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 716.167255] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 716.167255] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 716.167255] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 716.167255] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 716.167255] env[61439]: ERROR nova.compute.manager raise self.value [ 716.167255] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 716.167255] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 716.167255] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 716.167255] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 716.168496] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 716.168496] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 716.168496] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 6c845135-77bc-40b8-ab5d-e008de23383a, please check neutron logs for more information. [ 716.168496] env[61439]: ERROR nova.compute.manager [ 716.168496] env[61439]: Traceback (most recent call last): [ 716.168496] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 716.168496] env[61439]: listener.cb(fileno) [ 716.168496] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 716.168496] env[61439]: result = function(*args, **kwargs) [ 716.168496] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 716.168496] env[61439]: return func(*args, **kwargs) [ 716.168496] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 716.168496] env[61439]: raise e [ 716.168496] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 716.168496] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 716.168496] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 716.168496] env[61439]: created_port_ids = self._update_ports_for_instance( [ 716.168496] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 716.168496] env[61439]: with excutils.save_and_reraise_exception(): [ 716.168496] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 716.168496] env[61439]: self.force_reraise() [ 716.168496] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 716.168496] env[61439]: raise self.value [ 716.168496] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 716.168496] env[61439]: 
updated_port = self._update_port( [ 716.168496] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 716.168496] env[61439]: _ensure_no_port_binding_failure(port) [ 716.168496] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 716.168496] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 716.169345] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 6c845135-77bc-40b8-ab5d-e008de23383a, please check neutron logs for more information. [ 716.169345] env[61439]: Removing descriptor: 24 [ 716.169345] env[61439]: ERROR nova.compute.manager [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 6c845135-77bc-40b8-ab5d-e008de23383a, please check neutron logs for more information. 
[ 716.169345] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Traceback (most recent call last): [ 716.169345] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 716.169345] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] yield resources [ 716.169345] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 716.169345] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] self.driver.spawn(context, instance, image_meta, [ 716.169345] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 716.169345] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 716.169345] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 716.169345] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] vm_ref = self.build_virtual_machine(instance, [ 716.169740] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 716.169740] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] vif_infos = vmwarevif.get_vif_info(self._session, [ 716.169740] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 716.169740] env[61439]: ERROR 
nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] for vif in network_info: [ 716.169740] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 716.169740] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] return self._sync_wrapper(fn, *args, **kwargs) [ 716.169740] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 716.169740] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] self.wait() [ 716.169740] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 716.169740] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] self[:] = self._gt.wait() [ 716.169740] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 716.169740] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] return self._exit_event.wait() [ 716.169740] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 716.170177] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] result = hub.switch() [ 716.170177] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 716.170177] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] return self.greenlet.switch() [ 716.170177] 
env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 716.170177] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] result = function(*args, **kwargs) [ 716.170177] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 716.170177] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] return func(*args, **kwargs) [ 716.170177] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 716.170177] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] raise e [ 716.170177] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 716.170177] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] nwinfo = self.network_api.allocate_for_instance( [ 716.170177] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 716.170177] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] created_port_ids = self._update_ports_for_instance( [ 716.170555] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 716.170555] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] with excutils.save_and_reraise_exception(): [ 716.170555] env[61439]: ERROR nova.compute.manager 
[instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 716.170555] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] self.force_reraise() [ 716.170555] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 716.170555] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] raise self.value [ 716.170555] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 716.170555] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] updated_port = self._update_port( [ 716.170555] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 716.170555] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] _ensure_no_port_binding_failure(port) [ 716.170555] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 716.170555] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] raise exception.PortBindingFailed(port_id=port['id']) [ 716.170899] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] nova.exception.PortBindingFailed: Binding failed for port 6c845135-77bc-40b8-ab5d-e008de23383a, please check neutron logs for more information. 
[ 716.170899] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] [ 716.170899] env[61439]: INFO nova.compute.manager [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Terminating instance [ 716.170899] env[61439]: DEBUG oslo_concurrency.lockutils [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Acquiring lock "refresh_cache-de83001c-11a6-4c0d-86c9-9a5e582595bb" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 716.170899] env[61439]: DEBUG oslo_concurrency.lockutils [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Acquired lock "refresh_cache-de83001c-11a6-4c0d-86c9-9a5e582595bb" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 716.171856] env[61439]: DEBUG nova.network.neutron [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 716.181962] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61178244-80d7-4d67-b1fe-25029669ef3d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 716.191905] env[61439]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af856080-f52b-4044-97dc-5d9c2e9b89c1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 716.225862] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63fe5249-93d5-4dd5-9dac-bd78b398b242 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 716.233410] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddb44f75-c22c-4fe1-91f3-f3df52f55268 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 716.248784] env[61439]: DEBUG nova.compute.provider_tree [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 716.261183] env[61439]: DEBUG nova.scheduler.client.report [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 716.289069] env[61439]: DEBUG oslo_concurrency.lockutils [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 
tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.343s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 716.289708] env[61439]: ERROR nova.compute.manager [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 2fcf6fc6-cc25-4947-83e2-8491785cb7e2, please check neutron logs for more information. [ 716.289708] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Traceback (most recent call last): [ 716.289708] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 716.289708] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] self.driver.spawn(context, instance, image_meta, [ 716.289708] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 716.289708] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] self._vmops.spawn(context, instance, image_meta, injected_files, [ 716.289708] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 716.289708] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] vm_ref = self.build_virtual_machine(instance, [ 716.289708] env[61439]: ERROR nova.compute.manager [instance: 
ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 716.289708] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] vif_infos = vmwarevif.get_vif_info(self._session, [ 716.289708] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 716.290136] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] for vif in network_info: [ 716.290136] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 716.290136] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] return self._sync_wrapper(fn, *args, **kwargs) [ 716.290136] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 716.290136] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] self.wait() [ 716.290136] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 716.290136] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] self[:] = self._gt.wait() [ 716.290136] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 716.290136] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] return self._exit_event.wait() [ 716.290136] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", 
line 125, in wait [ 716.290136] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] result = hub.switch() [ 716.290136] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 716.290136] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] return self.greenlet.switch() [ 716.290522] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 716.290522] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] result = function(*args, **kwargs) [ 716.290522] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 716.290522] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] return func(*args, **kwargs) [ 716.290522] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 716.290522] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] raise e [ 716.290522] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 716.290522] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] nwinfo = self.network_api.allocate_for_instance( [ 716.290522] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 716.290522] env[61439]: ERROR nova.compute.manager 
[instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] created_port_ids = self._update_ports_for_instance( [ 716.290522] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 716.290522] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] with excutils.save_and_reraise_exception(): [ 716.290522] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 716.290914] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] self.force_reraise() [ 716.290914] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 716.290914] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] raise self.value [ 716.290914] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 716.290914] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] updated_port = self._update_port( [ 716.290914] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 716.290914] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] _ensure_no_port_binding_failure(port) [ 716.290914] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 716.290914] env[61439]: ERROR nova.compute.manager [instance: 
ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] raise exception.PortBindingFailed(port_id=port['id']) [ 716.290914] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] nova.exception.PortBindingFailed: Binding failed for port 2fcf6fc6-cc25-4947-83e2-8491785cb7e2, please check neutron logs for more information. [ 716.290914] env[61439]: ERROR nova.compute.manager [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] [ 716.291256] env[61439]: DEBUG nova.compute.utils [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Binding failed for port 2fcf6fc6-cc25-4947-83e2-8491785cb7e2, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 716.293538] env[61439]: DEBUG nova.compute.manager [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Build of instance ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197 was re-scheduled: Binding failed for port 2fcf6fc6-cc25-4947-83e2-8491785cb7e2, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 716.293538] env[61439]: DEBUG nova.compute.manager [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 716.293538] env[61439]: DEBUG oslo_concurrency.lockutils [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Acquiring lock "refresh_cache-ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 716.293538] env[61439]: DEBUG oslo_concurrency.lockutils [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Acquired lock "refresh_cache-ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 716.293861] env[61439]: DEBUG nova.network.neutron [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 716.464962] env[61439]: DEBUG nova.network.neutron [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 716.470022] env[61439]: DEBUG nova.network.neutron [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 717.029037] env[61439]: DEBUG nova.network.neutron [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 717.037595] env[61439]: DEBUG nova.network.neutron [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 717.046434] env[61439]: DEBUG oslo_concurrency.lockutils [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Releasing lock "refresh_cache-ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 717.046762] env[61439]: DEBUG nova.compute.manager [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 717.046857] env[61439]: DEBUG nova.compute.manager [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 717.047027] env[61439]: DEBUG nova.network.neutron [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 717.056092] env[61439]: DEBUG oslo_concurrency.lockutils [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Releasing lock "refresh_cache-de83001c-11a6-4c0d-86c9-9a5e582595bb" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 717.056582] env[61439]: DEBUG nova.compute.manager [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 717.056782] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 717.057675] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-893534a8-1d2a-48d9-8645-c513f8aa661f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 717.070452] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd825255-5fdc-4cd0-a6a5-27f887f3d53a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 717.095209] env[61439]: DEBUG nova.network.neutron [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 717.102481] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance de83001c-11a6-4c0d-86c9-9a5e582595bb could not be found. 
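The warning above shows the vmwareapi driver tolerating a VM that is already gone: `destroy` catches `InstanceNotFound` and still logs "Instance destroyed", so teardown stays idempotent. A minimal sketch of that shape, assuming illustrative names (`find_vm`, `destroy_instance`) rather than Nova's actual internals:

```python
class InstanceNotFound(Exception):
    """Stand-in for nova.exception.InstanceNotFound."""

def find_vm(known_vms, uuid):
    # Look up the VM on the backend; raise if it does not exist there.
    if uuid not in known_vms:
        raise InstanceNotFound(f"Instance {uuid} could not be found.")
    return known_vms[uuid]

def destroy_instance(known_vms, uuid, log):
    """Destroy a VM; a missing backend VM is a warning, not a failure."""
    try:
        find_vm(known_vms, uuid)
        known_vms.pop(uuid)
    except InstanceNotFound as e:
        # Mirror the log: note the miss, then proceed as if destroyed.
        log.append(f"WARNING Instance does not exist on backend: {e}")
    log.append("DEBUG Instance destroyed")

log = []
destroy_instance({}, "de83001c-11a6-4c0d-86c9-9a5e582595bb", log)
```

Treating not-found as success is what lets the subsequent "Instance destroyed" and network deallocation steps in the log run unconditionally.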
[ 717.102661] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 717.102840] env[61439]: INFO nova.compute.manager [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Took 0.05 seconds to destroy the instance on the hypervisor. [ 717.103109] env[61439]: DEBUG oslo.service.loopingcall [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 717.103615] env[61439]: DEBUG nova.compute.manager [-] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 717.103708] env[61439]: DEBUG nova.network.neutron [-] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 717.110263] env[61439]: DEBUG nova.network.neutron [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 717.126109] env[61439]: INFO nova.compute.manager [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197] Took 0.08 seconds to deallocate network for instance. [ 717.161883] env[61439]: DEBUG nova.network.neutron [-] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 717.174832] env[61439]: DEBUG nova.network.neutron [-] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 717.192321] env[61439]: INFO nova.compute.manager [-] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Took 0.09 seconds to deallocate network for instance. 
[ 717.196398] env[61439]: DEBUG nova.compute.claims [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 717.196398] env[61439]: DEBUG oslo_concurrency.lockutils [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 717.196398] env[61439]: DEBUG oslo_concurrency.lockutils [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 717.282287] env[61439]: DEBUG nova.network.neutron [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Successfully created port: c518959c-f1d5-4422-8c26-5d7f93544273 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 717.285443] env[61439]: INFO nova.scheduler.client.report [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Deleted allocations for instance ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197 [ 717.312522] env[61439]: DEBUG 
oslo_concurrency.lockutils [None req-568d335f-5ba7-48b9-b2ae-8950844d2be5 tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Lock "ae2d23ed-62f5-48d5-bf8f-f5bbf38c9197" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.716s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 717.421896] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2e22954-6737-4859-8190-e02ce0453897 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 717.432028] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00348314-b55d-4031-a519-478c1c2582a5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 717.467695] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5f1df77-0d48-4d6d-a97a-e77d59a0ad05 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 717.476290] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10c26627-5d12-4b04-804e-1d50caad20b2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 717.490076] env[61439]: DEBUG nova.compute.provider_tree [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 717.513431] env[61439]: DEBUG 
nova.scheduler.client.report [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 717.531065] env[61439]: DEBUG oslo_concurrency.lockutils [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.335s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 717.531755] env[61439]: ERROR nova.compute.manager [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 6c845135-77bc-40b8-ab5d-e008de23383a, please check neutron logs for more information. 
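The `PortBindingFailed` errors in this log all originate from the same check: Neutron marks a port whose binding failed with `binding:vif_type = 'binding_failed'`, and Nova turns that into an exception. A simplified stand-in for `nova.network.neutron._ensure_no_port_binding_failure` (the helper name and exception wording come from the tracebacks above; the implementation here is a sketch, not Nova's code):

```python
class PortBindingFailed(Exception):
    def __init__(self, port_id):
        super().__init__(
            f"Binding failed for port {port_id}, "
            "please check neutron logs for more information.")

def ensure_no_port_binding_failure(port):
    # Neutron reports a failed binding via the port's binding:vif_type field.
    if port.get('binding:vif_type') == 'binding_failed':
        raise PortBindingFailed(port['id'])

ok_port = {'id': 'c518959c-f1d5-4422-8c26-5d7f93544273',
           'binding:vif_type': 'ovs'}
bad_port = {'id': '6c845135-77bc-40b8-ab5d-e008de23383a',
            'binding:vif_type': 'binding_failed'}

ensure_no_port_binding_failure(ok_port)  # passes silently
try:
    ensure_no_port_binding_failure(bad_port)
except PortBindingFailed as e:
    message = str(e)
```

The exception message is what surfaces verbatim in the ERROR records; the root cause lives on the Neutron side, which is why every occurrence says to check the neutron logs.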
[ 717.531755] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Traceback (most recent call last): [ 717.531755] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 717.531755] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] self.driver.spawn(context, instance, image_meta, [ 717.531755] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 717.531755] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 717.531755] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 717.531755] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] vm_ref = self.build_virtual_machine(instance, [ 717.531755] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 717.531755] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] vif_infos = vmwarevif.get_vif_info(self._session, [ 717.531755] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 717.532192] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] for vif in network_info: [ 717.532192] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 717.532192] env[61439]: ERROR 
nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] return self._sync_wrapper(fn, *args, **kwargs) [ 717.532192] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 717.532192] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] self.wait() [ 717.532192] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 717.532192] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] self[:] = self._gt.wait() [ 717.532192] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 717.532192] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] return self._exit_event.wait() [ 717.532192] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 717.532192] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] result = hub.switch() [ 717.532192] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 717.532192] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] return self.greenlet.switch() [ 717.532615] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 717.532615] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] result 
= function(*args, **kwargs) [ 717.532615] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 717.532615] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] return func(*args, **kwargs) [ 717.532615] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 717.532615] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] raise e [ 717.532615] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 717.532615] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] nwinfo = self.network_api.allocate_for_instance( [ 717.532615] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 717.532615] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] created_port_ids = self._update_ports_for_instance( [ 717.532615] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 717.532615] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] with excutils.save_and_reraise_exception(): [ 717.532615] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 717.532999] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] self.force_reraise() [ 717.532999] env[61439]: 
ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 717.532999] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] raise self.value [ 717.532999] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 717.532999] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] updated_port = self._update_port( [ 717.532999] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 717.532999] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] _ensure_no_port_binding_failure(port) [ 717.532999] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 717.532999] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] raise exception.PortBindingFailed(port_id=port['id']) [ 717.532999] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] nova.exception.PortBindingFailed: Binding failed for port 6c845135-77bc-40b8-ab5d-e008de23383a, please check neutron logs for more information. 
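Each traceback passes through `excutils.save_and_reraise_exception()` and its `force_reraise()` (`raise self.value`). The pattern: inside an `except` block, capture the in-flight exception, run cleanup in the `with` body, then re-raise the original on exit. A minimal stdlib re-implementation for illustration, assuming this simplified shape rather than oslo.utils' full API:

```python
import sys

class save_and_reraise_exception:
    """Use inside an `except` block: run cleanup in the with-body,
    then re-raise the exception that was already in flight."""

    def __enter__(self):
        # Capture the active exception at entry.
        self.type_, self.value, self.tb = sys.exc_info()
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        if exc_type is None and self.value is not None:
            raise self.value  # matches 'raise self.value' in the traceback
        return False  # a new exception raised by the body wins instead

cleanup = []
try:
    try:
        raise ValueError("Binding failed for port ...")
    except ValueError:
        with save_and_reraise_exception():
            cleanup.append("deallocated ports")  # cleanup before re-raise
except ValueError as e:
    caught = str(e)
```

This is why the tracebacks show cleanup frames (`_update_ports_for_instance`) between the original failure and the final `PortBindingFailed`: the ports are rolled back first, and the original exception still propagates to `_allocate_network_async`.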
[ 717.532999] env[61439]: ERROR nova.compute.manager [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] [ 717.533319] env[61439]: DEBUG nova.compute.utils [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Binding failed for port 6c845135-77bc-40b8-ab5d-e008de23383a, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 717.534373] env[61439]: DEBUG nova.compute.manager [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Build of instance de83001c-11a6-4c0d-86c9-9a5e582595bb was re-scheduled: Binding failed for port 6c845135-77bc-40b8-ab5d-e008de23383a, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 717.534852] env[61439]: DEBUG nova.compute.manager [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 717.535094] env[61439]: DEBUG oslo_concurrency.lockutils [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Acquiring lock "refresh_cache-de83001c-11a6-4c0d-86c9-9a5e582595bb" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 717.538032] env[61439]: DEBUG oslo_concurrency.lockutils [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Acquired lock "refresh_cache-de83001c-11a6-4c0d-86c9-9a5e582595bb" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 717.538032] env[61439]: DEBUG nova.network.neutron [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 717.623184] env[61439]: DEBUG nova.network.neutron [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] 
Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 717.662996] env[61439]: ERROR nova.compute.manager [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 17a69317-ee7c-4a40-a06a-39537a1c4a67, please check neutron logs for more information. [ 717.662996] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 717.662996] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 717.662996] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 717.662996] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 717.662996] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 717.662996] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 717.662996] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 717.662996] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 717.662996] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 717.662996] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 717.662996] env[61439]: ERROR nova.compute.manager raise self.value [ 717.662996] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 717.662996] env[61439]: ERROR nova.compute.manager 
updated_port = self._update_port( [ 717.662996] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 717.662996] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 717.664295] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 717.664295] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 717.664295] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 17a69317-ee7c-4a40-a06a-39537a1c4a67, please check neutron logs for more information. [ 717.664295] env[61439]: ERROR nova.compute.manager [ 717.664295] env[61439]: Traceback (most recent call last): [ 717.664295] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 717.664295] env[61439]: listener.cb(fileno) [ 717.664295] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 717.664295] env[61439]: result = function(*args, **kwargs) [ 717.664295] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 717.664295] env[61439]: return func(*args, **kwargs) [ 717.664295] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 717.664295] env[61439]: raise e [ 717.664295] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 717.664295] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 717.664295] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 717.664295] env[61439]: created_port_ids = self._update_ports_for_instance( [ 717.664295] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 717.664295] env[61439]: with 
excutils.save_and_reraise_exception(): [ 717.664295] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 717.664295] env[61439]: self.force_reraise() [ 717.664295] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 717.664295] env[61439]: raise self.value [ 717.664295] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 717.664295] env[61439]: updated_port = self._update_port( [ 717.664295] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 717.664295] env[61439]: _ensure_no_port_binding_failure(port) [ 717.664295] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 717.664295] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 717.665194] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 17a69317-ee7c-4a40-a06a-39537a1c4a67, please check neutron logs for more information. [ 717.665194] env[61439]: Removing descriptor: 23 [ 717.665194] env[61439]: ERROR nova.compute.manager [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 17a69317-ee7c-4a40-a06a-39537a1c4a67, please check neutron logs for more information. 
[ 717.665194] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Traceback (most recent call last): [ 717.665194] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 717.665194] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] yield resources [ 717.665194] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 717.665194] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] self.driver.spawn(context, instance, image_meta, [ 717.665194] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 717.665194] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 717.665194] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 717.665194] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] vm_ref = self.build_virtual_machine(instance, [ 717.665750] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 717.665750] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] vif_infos = vmwarevif.get_vif_info(self._session, [ 717.665750] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 717.665750] env[61439]: ERROR 
nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] for vif in network_info: [ 717.665750] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 717.665750] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] return self._sync_wrapper(fn, *args, **kwargs) [ 717.665750] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 717.665750] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] self.wait() [ 717.665750] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 717.665750] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] self[:] = self._gt.wait() [ 717.665750] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 717.665750] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] return self._exit_event.wait() [ 717.665750] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 717.666204] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] result = hub.switch() [ 717.666204] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 717.666204] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] return self.greenlet.switch() [ 717.666204] 
env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 717.666204] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] result = function(*args, **kwargs) [ 717.666204] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 717.666204] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] return func(*args, **kwargs) [ 717.666204] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 717.666204] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] raise e [ 717.666204] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 717.666204] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] nwinfo = self.network_api.allocate_for_instance( [ 717.666204] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 717.666204] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] created_port_ids = self._update_ports_for_instance( [ 717.668447] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 717.668447] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] with excutils.save_and_reraise_exception(): [ 717.668447] env[61439]: ERROR nova.compute.manager 
[instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 717.668447] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] self.force_reraise() [ 717.668447] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 717.668447] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] raise self.value [ 717.668447] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 717.668447] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] updated_port = self._update_port( [ 717.668447] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 717.668447] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] _ensure_no_port_binding_failure(port) [ 717.668447] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 717.668447] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] raise exception.PortBindingFailed(port_id=port['id']) [ 717.668856] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] nova.exception.PortBindingFailed: Binding failed for port 17a69317-ee7c-4a40-a06a-39537a1c4a67, please check neutron logs for more information. 
[ 717.668856] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] [ 717.668856] env[61439]: INFO nova.compute.manager [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Terminating instance [ 717.669720] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Acquiring lock "refresh_cache-ec1394bf-266b-4830-8996-b6221c47c2e1" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 717.669984] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Acquired lock "refresh_cache-ec1394bf-266b-4830-8996-b6221c47c2e1" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 717.670863] env[61439]: DEBUG nova.network.neutron [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 717.722888] env[61439]: DEBUG nova.network.neutron [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 717.991076] env[61439]: DEBUG nova.network.neutron [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 718.002683] env[61439]: DEBUG oslo_concurrency.lockutils [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Releasing lock "refresh_cache-de83001c-11a6-4c0d-86c9-9a5e582595bb" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 718.002683] env[61439]: DEBUG nova.compute.manager [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 718.002968] env[61439]: DEBUG nova.compute.manager [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 718.002968] env[61439]: DEBUG nova.network.neutron [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 718.046128] env[61439]: DEBUG nova.network.neutron [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 718.056018] env[61439]: DEBUG nova.network.neutron [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 718.068339] env[61439]: INFO nova.compute.manager [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] [instance: de83001c-11a6-4c0d-86c9-9a5e582595bb] Took 0.07 seconds to deallocate network for instance. [ 718.095729] env[61439]: DEBUG nova.network.neutron [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 718.107909] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Releasing lock "refresh_cache-ec1394bf-266b-4830-8996-b6221c47c2e1" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 718.108086] env[61439]: DEBUG nova.compute.manager [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 718.108275] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 718.110714] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-aa0de16a-7a9a-42a7-bfd6-e95d53ce72eb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 718.120899] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5babbbd-a0d2-4859-a8d7-2d0a0de09268 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 718.148246] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ec1394bf-266b-4830-8996-b6221c47c2e1 could not be found. 
[ 718.148504] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 718.148721] env[61439]: INFO nova.compute.manager [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Took 0.04 seconds to destroy the instance on the hypervisor. [ 718.148926] env[61439]: DEBUG oslo.service.loopingcall [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 718.149706] env[61439]: DEBUG nova.compute.manager [-] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 718.149795] env[61439]: DEBUG nova.network.neutron [-] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 718.175108] env[61439]: INFO nova.scheduler.client.report [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Deleted allocations for instance de83001c-11a6-4c0d-86c9-9a5e582595bb [ 718.192034] env[61439]: DEBUG nova.network.neutron [-] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 718.196302] env[61439]: DEBUG oslo_concurrency.lockutils [None req-939e2f71-1616-4b3f-beed-2129a71943a7 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150 tempest-FloatingIPsAssociationNegativeTestJSON-1058966150-project-member] Lock "de83001c-11a6-4c0d-86c9-9a5e582595bb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.624s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 718.209030] env[61439]: DEBUG nova.network.neutron [-] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 718.218319] env[61439]: INFO nova.compute.manager [-] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Took 0.07 seconds to deallocate network for instance. 
[ 718.220629] env[61439]: DEBUG nova.compute.claims [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 718.220629] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 718.220629] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 718.330660] env[61439]: DEBUG nova.network.neutron [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Successfully created port: 9713d50d-981c-43bd-92b0-7c766f373792 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 718.411244] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fecb546-4db5-4582-a5d5-9332b05a2afa {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 718.421993] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-f119880f-255b-482a-8c62-0e5791b85c0a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 718.465656] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-440b3de9-6c5c-48e9-bb30-b256cb05079b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 718.477749] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28181da5-24be-4d89-ae3c-5acf2b9038e0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 718.500657] env[61439]: DEBUG nova.compute.provider_tree [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 718.526345] env[61439]: DEBUG nova.scheduler.client.report [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 718.552708] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 
tempest-AttachInterfacesV270Test-1607570909-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.332s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 718.554145] env[61439]: ERROR nova.compute.manager [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 17a69317-ee7c-4a40-a06a-39537a1c4a67, please check neutron logs for more information. [ 718.554145] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Traceback (most recent call last): [ 718.554145] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 718.554145] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] self.driver.spawn(context, instance, image_meta, [ 718.554145] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 718.554145] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 718.554145] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 718.554145] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] vm_ref = self.build_virtual_machine(instance, [ 718.554145] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 718.554145] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] vif_infos = vmwarevif.get_vif_info(self._session, [ 718.554145] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 718.554636] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] for vif in network_info: [ 718.554636] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 718.554636] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] return self._sync_wrapper(fn, *args, **kwargs) [ 718.554636] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 718.554636] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] self.wait() [ 718.554636] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 718.554636] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] self[:] = self._gt.wait() [ 718.554636] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 718.554636] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] return self._exit_event.wait() [ 718.554636] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 718.554636] env[61439]: 
ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] result = hub.switch() [ 718.554636] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 718.554636] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] return self.greenlet.switch() [ 718.555099] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 718.555099] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] result = function(*args, **kwargs) [ 718.555099] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 718.555099] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] return func(*args, **kwargs) [ 718.555099] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 718.555099] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] raise e [ 718.555099] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 718.555099] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] nwinfo = self.network_api.allocate_for_instance( [ 718.555099] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 718.555099] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] 
created_port_ids = self._update_ports_for_instance( [ 718.555099] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 718.555099] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] with excutils.save_and_reraise_exception(): [ 718.555099] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 718.555524] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] self.force_reraise() [ 718.555524] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 718.555524] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] raise self.value [ 718.555524] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 718.555524] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] updated_port = self._update_port( [ 718.555524] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 718.555524] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] _ensure_no_port_binding_failure(port) [ 718.555524] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 718.555524] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] raise 
exception.PortBindingFailed(port_id=port['id']) [ 718.555524] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] nova.exception.PortBindingFailed: Binding failed for port 17a69317-ee7c-4a40-a06a-39537a1c4a67, please check neutron logs for more information. [ 718.555524] env[61439]: ERROR nova.compute.manager [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] [ 718.555880] env[61439]: DEBUG nova.compute.utils [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Binding failed for port 17a69317-ee7c-4a40-a06a-39537a1c4a67, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 718.555880] env[61439]: DEBUG nova.compute.manager [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Build of instance ec1394bf-266b-4830-8996-b6221c47c2e1 was re-scheduled: Binding failed for port 17a69317-ee7c-4a40-a06a-39537a1c4a67, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 718.558015] env[61439]: DEBUG nova.compute.manager [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 718.558015] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Acquiring lock "refresh_cache-ec1394bf-266b-4830-8996-b6221c47c2e1" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 718.558015] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Acquired lock "refresh_cache-ec1394bf-266b-4830-8996-b6221c47c2e1" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 718.558015] env[61439]: DEBUG nova.network.neutron [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 718.763813] env[61439]: DEBUG nova.network.neutron [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 719.030465] env[61439]: DEBUG nova.network.neutron [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 719.043466] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Releasing lock "refresh_cache-ec1394bf-266b-4830-8996-b6221c47c2e1" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 719.043466] env[61439]: DEBUG nova.compute.manager [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 719.043466] env[61439]: DEBUG nova.compute.manager [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 719.043466] env[61439]: DEBUG nova.network.neutron [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 719.106120] env[61439]: DEBUG nova.network.neutron [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 719.124159] env[61439]: DEBUG nova.network.neutron [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 719.138551] env[61439]: INFO nova.compute.manager [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] [instance: ec1394bf-266b-4830-8996-b6221c47c2e1] Took 0.09 seconds to deallocate network for instance. 
[ 719.274630] env[61439]: INFO nova.scheduler.client.report [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Deleted allocations for instance ec1394bf-266b-4830-8996-b6221c47c2e1 [ 719.294977] env[61439]: DEBUG oslo_concurrency.lockutils [None req-3f94da63-dab4-43b5-8ccc-161a52557d23 tempest-AttachInterfacesV270Test-1607570909 tempest-AttachInterfacesV270Test-1607570909-project-member] Lock "ec1394bf-266b-4830-8996-b6221c47c2e1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.485s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 719.317795] env[61439]: DEBUG oslo_concurrency.lockutils [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Acquiring lock "b63e50ed-41f9-4e77-9b94-24915b296e2d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 719.318313] env[61439]: DEBUG oslo_concurrency.lockutils [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Lock "b63e50ed-41f9-4e77-9b94-24915b296e2d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 719.334486] env[61439]: DEBUG nova.compute.manager [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 719.409716] env[61439]: DEBUG oslo_concurrency.lockutils [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 719.409981] env[61439]: DEBUG oslo_concurrency.lockutils [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 719.411596] env[61439]: INFO nova.compute.claims [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 719.660524] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-179a0c5c-b3a6-4050-af3f-9f398e221d87 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 719.672433] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c373ca8-8478-41a6-8abe-fea1a15f8cb1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 719.708437] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64043a0f-87b0-4ed9-be4d-18a945aa8600 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 719.719192] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29cddd84-52a5-4281-a062-c1aa64aec01c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 719.734044] env[61439]: DEBUG nova.compute.provider_tree [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 719.745108] env[61439]: DEBUG nova.scheduler.client.report [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 719.761017] env[61439]: DEBUG oslo_concurrency.lockutils [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.351s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 719.761817] env[61439]: DEBUG nova.compute.manager [None 
req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 719.808621] env[61439]: DEBUG nova.compute.utils [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 719.811819] env[61439]: DEBUG nova.compute.manager [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 719.811819] env[61439]: DEBUG nova.network.neutron [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 719.821937] env[61439]: DEBUG nova.compute.manager [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Start building block device mappings for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 719.896472] env[61439]: DEBUG nova.compute.manager [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 719.938726] env[61439]: DEBUG nova.virt.hardware [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 719.938953] env[61439]: DEBUG nova.virt.hardware [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 719.939129] env[61439]: DEBUG nova.virt.hardware [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 
tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 719.939543] env[61439]: DEBUG nova.virt.hardware [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 719.939717] env[61439]: DEBUG nova.virt.hardware [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 719.939865] env[61439]: DEBUG nova.virt.hardware [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 719.940087] env[61439]: DEBUG nova.virt.hardware [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 719.940248] env[61439]: DEBUG nova.virt.hardware [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 719.940409] env[61439]: DEBUG nova.virt.hardware 
[None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 719.940566] env[61439]: DEBUG nova.virt.hardware [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 719.940734] env[61439]: DEBUG nova.virt.hardware [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 719.941666] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5582848c-b576-4633-b6a5-5a45543f51a7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 719.945527] env[61439]: ERROR nova.compute.manager [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 547f811a-085a-4dec-b008-82e96ca7b6be, please check neutron logs for more information. 
[ 719.945527] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 719.945527] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 719.945527] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 719.945527] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 719.945527] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 719.945527] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 719.945527] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 719.945527] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 719.945527] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 719.945527] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 719.945527] env[61439]: ERROR nova.compute.manager raise self.value [ 719.945527] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 719.945527] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 719.945527] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 719.945527] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 719.946139] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 719.946139] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 719.946139] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 547f811a-085a-4dec-b008-82e96ca7b6be, please check neutron logs for more information. [ 719.946139] env[61439]: ERROR nova.compute.manager [ 719.946139] env[61439]: Traceback (most recent call last): [ 719.946139] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 719.946139] env[61439]: listener.cb(fileno) [ 719.946139] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 719.946139] env[61439]: result = function(*args, **kwargs) [ 719.946139] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 719.946139] env[61439]: return func(*args, **kwargs) [ 719.946139] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 719.946139] env[61439]: raise e [ 719.946139] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 719.946139] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 719.946139] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 719.946139] env[61439]: created_port_ids = self._update_ports_for_instance( [ 719.946139] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 719.946139] env[61439]: with excutils.save_and_reraise_exception(): [ 719.946139] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 719.946139] env[61439]: self.force_reraise() [ 719.946139] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 719.946139] env[61439]: raise self.value [ 719.946139] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 719.946139] env[61439]: 
updated_port = self._update_port( [ 719.946139] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 719.946139] env[61439]: _ensure_no_port_binding_failure(port) [ 719.946139] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 719.946139] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 719.946946] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 547f811a-085a-4dec-b008-82e96ca7b6be, please check neutron logs for more information. [ 719.946946] env[61439]: Removing descriptor: 22 [ 719.946946] env[61439]: ERROR nova.compute.manager [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 547f811a-085a-4dec-b008-82e96ca7b6be, please check neutron logs for more information. 
[ 719.946946] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Traceback (most recent call last): [ 719.946946] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 719.946946] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] yield resources [ 719.946946] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 719.946946] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] self.driver.spawn(context, instance, image_meta, [ 719.946946] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 719.946946] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] self._vmops.spawn(context, instance, image_meta, injected_files, [ 719.946946] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 719.946946] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] vm_ref = self.build_virtual_machine(instance, [ 719.947726] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 719.947726] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] vif_infos = vmwarevif.get_vif_info(self._session, [ 719.947726] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 719.947726] env[61439]: ERROR 
nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] for vif in network_info: [ 719.947726] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 719.947726] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] return self._sync_wrapper(fn, *args, **kwargs) [ 719.947726] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 719.947726] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] self.wait() [ 719.947726] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 719.947726] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] self[:] = self._gt.wait() [ 719.947726] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 719.947726] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] return self._exit_event.wait() [ 719.947726] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 719.948186] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] result = hub.switch() [ 719.948186] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 719.948186] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] return self.greenlet.switch() [ 719.948186] 
env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 719.948186] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] result = function(*args, **kwargs) [ 719.948186] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 719.948186] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] return func(*args, **kwargs) [ 719.948186] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 719.948186] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] raise e [ 719.948186] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 719.948186] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] nwinfo = self.network_api.allocate_for_instance( [ 719.948186] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 719.948186] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] created_port_ids = self._update_ports_for_instance( [ 719.948599] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 719.948599] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] with excutils.save_and_reraise_exception(): [ 719.948599] env[61439]: ERROR nova.compute.manager 
[instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 719.948599] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] self.force_reraise() [ 719.948599] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 719.948599] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] raise self.value [ 719.948599] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 719.948599] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] updated_port = self._update_port( [ 719.948599] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 719.948599] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] _ensure_no_port_binding_failure(port) [ 719.948599] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 719.948599] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] raise exception.PortBindingFailed(port_id=port['id']) [ 719.948997] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] nova.exception.PortBindingFailed: Binding failed for port 547f811a-085a-4dec-b008-82e96ca7b6be, please check neutron logs for more information. 
[ 719.948997] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] [ 719.948997] env[61439]: INFO nova.compute.manager [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Terminating instance [ 719.949920] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Acquiring lock "refresh_cache-3a0fe362-53a6-48a4-9fdd-cc81586bacfc" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 719.949920] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Acquired lock "refresh_cache-3a0fe362-53a6-48a4-9fdd-cc81586bacfc" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 719.949920] env[61439]: DEBUG nova.network.neutron [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 719.955352] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49d681af-853e-40a1-9598-50fc20203fac {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 720.067496] env[61439]: DEBUG nova.network.neutron [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 
3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 720.226734] env[61439]: DEBUG nova.policy [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b0ff0bbf7696481aa0a4d86c30d3dc7e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0e62d95c2bfa4690a3d6662a057b6cd2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 720.506992] env[61439]: DEBUG nova.compute.manager [req-da1f105a-abf0-4285-a56c-c9eb3d6e0702 req-f6cacf59-30f8-4fcf-b804-04519a9ba96d service nova] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Received event network-changed-547f811a-085a-4dec-b008-82e96ca7b6be {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 720.506992] env[61439]: DEBUG nova.compute.manager [req-da1f105a-abf0-4285-a56c-c9eb3d6e0702 req-f6cacf59-30f8-4fcf-b804-04519a9ba96d service nova] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Refreshing instance network info cache due to event network-changed-547f811a-085a-4dec-b008-82e96ca7b6be. 
{{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 720.506992] env[61439]: DEBUG oslo_concurrency.lockutils [req-da1f105a-abf0-4285-a56c-c9eb3d6e0702 req-f6cacf59-30f8-4fcf-b804-04519a9ba96d service nova] Acquiring lock "refresh_cache-3a0fe362-53a6-48a4-9fdd-cc81586bacfc" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 720.902724] env[61439]: DEBUG nova.network.neutron [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 720.902724] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Releasing lock "refresh_cache-3a0fe362-53a6-48a4-9fdd-cc81586bacfc" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 720.902724] env[61439]: DEBUG nova.compute.manager [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 720.902724] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 720.903014] env[61439]: DEBUG oslo_concurrency.lockutils [req-da1f105a-abf0-4285-a56c-c9eb3d6e0702 req-f6cacf59-30f8-4fcf-b804-04519a9ba96d service nova] Acquired lock "refresh_cache-3a0fe362-53a6-48a4-9fdd-cc81586bacfc" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 720.903014] env[61439]: DEBUG nova.network.neutron [req-da1f105a-abf0-4285-a56c-c9eb3d6e0702 req-f6cacf59-30f8-4fcf-b804-04519a9ba96d service nova] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Refreshing network info cache for port 547f811a-085a-4dec-b008-82e96ca7b6be {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 720.903014] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8c142919-ea35-486b-9a21-1cf7fe01ed9b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 720.903014] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4caedfc-c809-40b2-bd30-a2408cb991bd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 720.934207] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 
3a0fe362-53a6-48a4-9fdd-cc81586bacfc could not be found. [ 720.934448] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 720.934847] env[61439]: INFO nova.compute.manager [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Took 0.05 seconds to destroy the instance on the hypervisor. [ 720.934975] env[61439]: DEBUG oslo.service.loopingcall [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 720.935176] env[61439]: DEBUG nova.compute.manager [-] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 720.935353] env[61439]: DEBUG nova.network.neutron [-] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 720.989118] env[61439]: DEBUG nova.network.neutron [req-da1f105a-abf0-4285-a56c-c9eb3d6e0702 req-f6cacf59-30f8-4fcf-b804-04519a9ba96d service nova] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 721.018715] env[61439]: DEBUG nova.network.neutron [-] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 721.078555] env[61439]: DEBUG nova.network.neutron [-] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 721.093584] env[61439]: INFO nova.compute.manager [-] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Took 0.16 seconds to deallocate network for instance. [ 721.096566] env[61439]: DEBUG nova.compute.claims [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 721.096566] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 721.096566] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 721.283808] env[61439]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4bb0291-e8a7-4b3a-9b29-a677a8d58052 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 721.293348] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85ba39d5-c04f-431d-953e-c43626278e20 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 721.328421] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-befb717c-817d-4af8-a14e-8d18dcc3cb3f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 721.337348] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4426706a-f69f-43d3-a7bd-45dac33be1bf {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 721.352806] env[61439]: DEBUG nova.compute.provider_tree [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 721.365029] env[61439]: DEBUG nova.scheduler.client.report [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 
'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 721.381978] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.285s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 721.382622] env[61439]: ERROR nova.compute.manager [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 547f811a-085a-4dec-b008-82e96ca7b6be, please check neutron logs for more information. 
[ 721.382622] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Traceback (most recent call last): [ 721.382622] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 721.382622] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] self.driver.spawn(context, instance, image_meta, [ 721.382622] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 721.382622] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] self._vmops.spawn(context, instance, image_meta, injected_files, [ 721.382622] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 721.382622] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] vm_ref = self.build_virtual_machine(instance, [ 721.382622] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 721.382622] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] vif_infos = vmwarevif.get_vif_info(self._session, [ 721.382622] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 721.382991] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] for vif in network_info: [ 721.382991] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 721.382991] env[61439]: ERROR 
nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] return self._sync_wrapper(fn, *args, **kwargs) [ 721.382991] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 721.382991] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] self.wait() [ 721.382991] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 721.382991] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] self[:] = self._gt.wait() [ 721.382991] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 721.382991] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] return self._exit_event.wait() [ 721.382991] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 721.382991] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] result = hub.switch() [ 721.382991] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 721.382991] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] return self.greenlet.switch() [ 721.383488] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 721.383488] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] result 
= function(*args, **kwargs) [ 721.383488] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 721.383488] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] return func(*args, **kwargs) [ 721.383488] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 721.383488] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] raise e [ 721.383488] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 721.383488] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] nwinfo = self.network_api.allocate_for_instance( [ 721.383488] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 721.383488] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] created_port_ids = self._update_ports_for_instance( [ 721.383488] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 721.383488] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] with excutils.save_and_reraise_exception(): [ 721.383488] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 721.383959] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] self.force_reraise() [ 721.383959] env[61439]: 
ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 721.383959] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] raise self.value [ 721.383959] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 721.383959] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] updated_port = self._update_port( [ 721.383959] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 721.383959] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] _ensure_no_port_binding_failure(port) [ 721.383959] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 721.383959] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] raise exception.PortBindingFailed(port_id=port['id']) [ 721.383959] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] nova.exception.PortBindingFailed: Binding failed for port 547f811a-085a-4dec-b008-82e96ca7b6be, please check neutron logs for more information. 
[ 721.383959] env[61439]: ERROR nova.compute.manager [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] [ 721.384389] env[61439]: DEBUG nova.compute.utils [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Binding failed for port 547f811a-085a-4dec-b008-82e96ca7b6be, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 721.386265] env[61439]: DEBUG nova.compute.manager [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Build of instance 3a0fe362-53a6-48a4-9fdd-cc81586bacfc was re-scheduled: Binding failed for port 547f811a-085a-4dec-b008-82e96ca7b6be, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 721.386713] env[61439]: DEBUG nova.compute.manager [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 721.386956] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Acquiring lock "refresh_cache-3a0fe362-53a6-48a4-9fdd-cc81586bacfc" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 721.498751] env[61439]: DEBUG nova.network.neutron [req-da1f105a-abf0-4285-a56c-c9eb3d6e0702 req-f6cacf59-30f8-4fcf-b804-04519a9ba96d service nova] [instance: 
3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 721.512667] env[61439]: DEBUG oslo_concurrency.lockutils [req-da1f105a-abf0-4285-a56c-c9eb3d6e0702 req-f6cacf59-30f8-4fcf-b804-04519a9ba96d service nova] Releasing lock "refresh_cache-3a0fe362-53a6-48a4-9fdd-cc81586bacfc" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 721.513527] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Acquired lock "refresh_cache-3a0fe362-53a6-48a4-9fdd-cc81586bacfc" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 721.513527] env[61439]: DEBUG nova.network.neutron [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 721.606128] env[61439]: DEBUG nova.network.neutron [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 721.996305] env[61439]: DEBUG nova.network.neutron [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Successfully created port: c142ff91-f8c7-47fa-849e-e56de72d907c {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 722.529414] env[61439]: DEBUG nova.network.neutron [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 722.540854] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Releasing lock "refresh_cache-3a0fe362-53a6-48a4-9fdd-cc81586bacfc" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 722.541111] env[61439]: DEBUG nova.compute.manager [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 722.541301] env[61439]: DEBUG nova.compute.manager [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 722.541470] env[61439]: DEBUG nova.network.neutron [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 722.640957] env[61439]: DEBUG nova.network.neutron [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 722.650185] env[61439]: DEBUG nova.network.neutron [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 722.660197] env[61439]: INFO nova.compute.manager [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] [instance: 3a0fe362-53a6-48a4-9fdd-cc81586bacfc] Took 0.12 seconds to deallocate network for instance. 
[ 722.779946] env[61439]: INFO nova.scheduler.client.report [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Deleted allocations for instance 3a0fe362-53a6-48a4-9fdd-cc81586bacfc [ 722.810281] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9f34a0ba-9d5a-413b-b9bd-87f36d39db09 tempest-ServerActionsTestOtherA-1328246412 tempest-ServerActionsTestOtherA-1328246412-project-member] Lock "3a0fe362-53a6-48a4-9fdd-cc81586bacfc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.722s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 724.305041] env[61439]: DEBUG oslo_concurrency.lockutils [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 724.305041] env[61439]: DEBUG oslo_concurrency.lockutils [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 724.327467] env[61439]: DEBUG nova.compute.manager [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 724.411404] env[61439]: DEBUG oslo_concurrency.lockutils [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 724.412071] env[61439]: DEBUG oslo_concurrency.lockutils [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 724.416361] env[61439]: INFO nova.compute.claims [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 724.692046] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41f4e804-d017-4f0c-a626-360943e1942d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 724.701638] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba915534-6e61-4e80-8203-2d3d3f635321 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 724.736285] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c470402-afdc-4a27-9e19-43e2271dc1a2 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 724.746842] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83448138-e6b9-421c-89fa-e1f988a6fe40 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 724.762654] env[61439]: DEBUG nova.compute.provider_tree [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 724.771820] env[61439]: DEBUG nova.scheduler.client.report [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 724.791375] env[61439]: DEBUG oslo_concurrency.lockutils [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.379s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 724.791893] env[61439]: DEBUG nova.compute.manager [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb 
tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 724.831110] env[61439]: DEBUG nova.compute.utils [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 724.833170] env[61439]: DEBUG nova.compute.manager [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 724.835630] env[61439]: DEBUG nova.network.neutron [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 724.847361] env[61439]: DEBUG nova.compute.manager [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 724.925713] env[61439]: DEBUG nova.compute.manager [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 724.952615] env[61439]: DEBUG nova.virt.hardware [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 724.953020] env[61439]: DEBUG nova.virt.hardware [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 724.953020] env[61439]: DEBUG nova.virt.hardware [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 724.953547] env[61439]: DEBUG nova.virt.hardware [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Flavor pref 0:0:0 {{(pid=61439) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 724.953547] env[61439]: DEBUG nova.virt.hardware [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 724.953547] env[61439]: DEBUG nova.virt.hardware [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 724.953685] env[61439]: DEBUG nova.virt.hardware [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 724.953815] env[61439]: DEBUG nova.virt.hardware [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 724.953981] env[61439]: DEBUG nova.virt.hardware [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 724.954159] env[61439]: DEBUG nova.virt.hardware [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 724.954331] env[61439]: DEBUG nova.virt.hardware [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 724.955252] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b2d8a7d-ee81-4451-87ca-f93edb1c1c69 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 724.964361] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bef4000-a631-4385-ad93-a3947beb91a8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 725.117717] env[61439]: DEBUG nova.policy [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2af2fd8431af45ca891f744f4d10b54f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ca364a2df93a424f8b66ee39d9b0b120', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 726.083858] env[61439]: ERROR nova.compute.manager [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Instance failed network setup after 
1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port c518959c-f1d5-4422-8c26-5d7f93544273, please check neutron logs for more information. [ 726.083858] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 726.083858] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 726.083858] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 726.083858] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 726.083858] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 726.083858] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 726.083858] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 726.083858] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 726.083858] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 726.083858] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 726.083858] env[61439]: ERROR nova.compute.manager raise self.value [ 726.083858] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 726.083858] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 726.083858] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 726.083858] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 726.084442] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in 
_ensure_no_port_binding_failure [ 726.084442] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 726.084442] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port c518959c-f1d5-4422-8c26-5d7f93544273, please check neutron logs for more information. [ 726.084442] env[61439]: ERROR nova.compute.manager [ 726.084442] env[61439]: Traceback (most recent call last): [ 726.084442] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 726.084442] env[61439]: listener.cb(fileno) [ 726.084442] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 726.084442] env[61439]: result = function(*args, **kwargs) [ 726.084442] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 726.084442] env[61439]: return func(*args, **kwargs) [ 726.084442] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 726.084442] env[61439]: raise e [ 726.084442] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 726.084442] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 726.084442] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 726.084442] env[61439]: created_port_ids = self._update_ports_for_instance( [ 726.084442] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 726.084442] env[61439]: with excutils.save_and_reraise_exception(): [ 726.084442] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 726.084442] env[61439]: self.force_reraise() [ 726.084442] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 726.084442] env[61439]: 
raise self.value [ 726.084442] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 726.084442] env[61439]: updated_port = self._update_port( [ 726.084442] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 726.084442] env[61439]: _ensure_no_port_binding_failure(port) [ 726.084442] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 726.084442] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 726.085103] env[61439]: nova.exception.PortBindingFailed: Binding failed for port c518959c-f1d5-4422-8c26-5d7f93544273, please check neutron logs for more information. [ 726.085103] env[61439]: Removing descriptor: 10 [ 726.085103] env[61439]: ERROR nova.compute.manager [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port c518959c-f1d5-4422-8c26-5d7f93544273, please check neutron logs for more information. 
[ 726.085103] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Traceback (most recent call last): [ 726.085103] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 726.085103] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] yield resources [ 726.085103] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 726.085103] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] self.driver.spawn(context, instance, image_meta, [ 726.085103] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 726.085103] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 726.085103] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 726.085103] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] vm_ref = self.build_virtual_machine(instance, [ 726.085376] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 726.085376] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] vif_infos = vmwarevif.get_vif_info(self._session, [ 726.085376] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 726.085376] env[61439]: ERROR 
nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] for vif in network_info: [ 726.085376] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 726.085376] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] return self._sync_wrapper(fn, *args, **kwargs) [ 726.085376] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 726.085376] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] self.wait() [ 726.085376] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 726.085376] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] self[:] = self._gt.wait() [ 726.085376] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 726.085376] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] return self._exit_event.wait() [ 726.085376] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 726.085660] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] result = hub.switch() [ 726.085660] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 726.085660] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] return self.greenlet.switch() [ 726.085660] 
env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 726.085660] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] result = function(*args, **kwargs) [ 726.085660] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 726.085660] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] return func(*args, **kwargs) [ 726.085660] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 726.085660] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] raise e [ 726.085660] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 726.085660] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] nwinfo = self.network_api.allocate_for_instance( [ 726.085660] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 726.085660] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] created_port_ids = self._update_ports_for_instance( [ 726.085942] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 726.085942] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] with excutils.save_and_reraise_exception(): [ 726.085942] env[61439]: ERROR nova.compute.manager 
[instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 726.085942] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] self.force_reraise() [ 726.085942] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 726.085942] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] raise self.value [ 726.085942] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 726.085942] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] updated_port = self._update_port( [ 726.085942] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 726.085942] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] _ensure_no_port_binding_failure(port) [ 726.085942] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 726.085942] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] raise exception.PortBindingFailed(port_id=port['id']) [ 726.086936] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] nova.exception.PortBindingFailed: Binding failed for port c518959c-f1d5-4422-8c26-5d7f93544273, please check neutron logs for more information. 
[ 726.086936] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] [ 726.086936] env[61439]: INFO nova.compute.manager [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Terminating instance [ 726.089387] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Acquiring lock "refresh_cache-3495611c-05aa-4c1c-89fc-16ecb1fcb6bb" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 726.089547] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Acquired lock "refresh_cache-3495611c-05aa-4c1c-89fc-16ecb1fcb6bb" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 726.089718] env[61439]: DEBUG nova.network.neutron [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 726.218828] env[61439]: DEBUG nova.network.neutron [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 727.028162] env[61439]: ERROR nova.compute.manager [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port c142ff91-f8c7-47fa-849e-e56de72d907c, please check neutron logs for more information. [ 727.028162] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 727.028162] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 727.028162] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 727.028162] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 727.028162] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 727.028162] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 727.028162] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 727.028162] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 727.028162] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 727.028162] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 727.028162] env[61439]: ERROR nova.compute.manager raise self.value [ 727.028162] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 727.028162] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 727.028162] 
env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 727.028162] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 727.028553] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 727.028553] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 727.028553] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port c142ff91-f8c7-47fa-849e-e56de72d907c, please check neutron logs for more information. [ 727.028553] env[61439]: ERROR nova.compute.manager [ 727.028553] env[61439]: Traceback (most recent call last): [ 727.028553] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 727.028553] env[61439]: listener.cb(fileno) [ 727.028553] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 727.028553] env[61439]: result = function(*args, **kwargs) [ 727.028553] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 727.028553] env[61439]: return func(*args, **kwargs) [ 727.028553] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 727.028553] env[61439]: raise e [ 727.028553] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 727.028553] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 727.028553] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 727.028553] env[61439]: created_port_ids = self._update_ports_for_instance( [ 727.028553] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 727.028553] env[61439]: with excutils.save_and_reraise_exception(): [ 727.028553] 
env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 727.028553] env[61439]: self.force_reraise() [ 727.028553] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 727.028553] env[61439]: raise self.value [ 727.028553] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 727.028553] env[61439]: updated_port = self._update_port( [ 727.028553] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 727.028553] env[61439]: _ensure_no_port_binding_failure(port) [ 727.028553] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 727.028553] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 727.029258] env[61439]: nova.exception.PortBindingFailed: Binding failed for port c142ff91-f8c7-47fa-849e-e56de72d907c, please check neutron logs for more information. [ 727.029258] env[61439]: Removing descriptor: 23 [ 727.030359] env[61439]: ERROR nova.compute.manager [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port c142ff91-f8c7-47fa-849e-e56de72d907c, please check neutron logs for more information. 
[ 727.030359] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Traceback (most recent call last): [ 727.030359] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 727.030359] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] yield resources [ 727.030359] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 727.030359] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] self.driver.spawn(context, instance, image_meta, [ 727.030359] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 727.030359] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 727.030359] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 727.030359] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] vm_ref = self.build_virtual_machine(instance, [ 727.030359] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 727.030697] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] vif_infos = vmwarevif.get_vif_info(self._session, [ 727.030697] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 727.030697] env[61439]: ERROR 
nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] for vif in network_info: [ 727.030697] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 727.030697] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] return self._sync_wrapper(fn, *args, **kwargs) [ 727.030697] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 727.030697] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] self.wait() [ 727.030697] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 727.030697] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] self[:] = self._gt.wait() [ 727.030697] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 727.030697] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] return self._exit_event.wait() [ 727.030697] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 727.030697] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] result = hub.switch() [ 727.031054] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 727.031054] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] return self.greenlet.switch() [ 727.031054] 
env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 727.031054] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] result = function(*args, **kwargs) [ 727.031054] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 727.031054] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] return func(*args, **kwargs) [ 727.031054] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 727.031054] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] raise e [ 727.031054] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 727.031054] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] nwinfo = self.network_api.allocate_for_instance( [ 727.031054] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 727.031054] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] created_port_ids = self._update_ports_for_instance( [ 727.031054] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 727.031318] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] with excutils.save_and_reraise_exception(): [ 727.031318] env[61439]: ERROR nova.compute.manager 
[instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 727.031318] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] self.force_reraise() [ 727.031318] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 727.031318] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] raise self.value [ 727.031318] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 727.031318] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] updated_port = self._update_port( [ 727.031318] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 727.031318] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] _ensure_no_port_binding_failure(port) [ 727.031318] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 727.031318] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] raise exception.PortBindingFailed(port_id=port['id']) [ 727.031318] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] nova.exception.PortBindingFailed: Binding failed for port c142ff91-f8c7-47fa-849e-e56de72d907c, please check neutron logs for more information. 
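The traceback above funnels through `excutils.save_and_reraise_exception()` (`__exit__` then `force_reraise` then `raise self.value`): cleanup runs inside the context body, and the originally caught exception is re-raised afterwards so the cleanup cannot mask it. A minimal, self-contained sketch of that control flow, assuming nothing beyond the stdlib (this is an illustrative re-implementation, not oslo's actual code; `update_ports` is a hypothetical stand-in for `_update_ports_for_instance`):

```python
import sys


class SaveAndReraise:
    """Illustrative re-implementation of the control flow of
    oslo_utils.excutils.save_and_reraise_exception(): capture the
    in-flight exception on entry, run cleanup in the body, then
    re-raise the saved exception on exit unless cleanup set
    .reraise = False."""

    def __enter__(self):
        # sys.exc_info() returns the exception currently being handled,
        # i.e. the one caught by the surrounding except block.
        _, self.value, self.tb = sys.exc_info()
        self.reraise = True
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        if exc_type is not None:
            return False  # cleanup itself raised; let that propagate
        if self.reraise and self.value is not None:
            raise self.value.with_traceback(self.tb)
        return False


def update_ports(ports):
    # Mirrors the shape of the trace: a failure mid-loop triggers
    # cleanup, after which the original error is re-raised unchanged.
    created = []
    try:
        for p in ports:
            if p.get("status") == "bad":
                raise RuntimeError("Binding failed for port %s" % p["id"])
            created.append(p["id"])
    except RuntimeError:
        with SaveAndReraise():
            created.clear()  # cleanup that must not mask the error
    return created
```

The point of the pattern is visible in the log: the `PortBindingFailed` that surfaces at the top of the traceback is the same object originally raised in `_ensure_no_port_binding_failure`, preserved across the cleanup in `_update_ports_for_instance`.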
[ 727.031318] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] [ 727.031588] env[61439]: INFO nova.compute.manager [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Terminating instance [ 727.032494] env[61439]: DEBUG oslo_concurrency.lockutils [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Acquiring lock "refresh_cache-b63e50ed-41f9-4e77-9b94-24915b296e2d" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 727.032682] env[61439]: DEBUG oslo_concurrency.lockutils [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Acquired lock "refresh_cache-b63e50ed-41f9-4e77-9b94-24915b296e2d" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 727.032837] env[61439]: DEBUG nova.network.neutron [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 727.057228] env[61439]: DEBUG nova.network.neutron [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 727.087312] env[61439]: DEBUG nova.network.neutron [None 
req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 727.092708] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Releasing lock "refresh_cache-3495611c-05aa-4c1c-89fc-16ecb1fcb6bb" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 727.093155] env[61439]: DEBUG nova.compute.manager [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 727.093393] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 727.093926] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b2b29a6d-42a5-462d-b04e-d0b41f32f9b8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 727.108861] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94a4c21a-b270-45ac-bfed-ad6981e1c22b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 727.133034] env[61439]: WARNING 
nova.virt.vmwareapi.vmops [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb could not be found. [ 727.133034] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 727.133177] env[61439]: INFO nova.compute.manager [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Took 0.04 seconds to destroy the instance on the hypervisor. [ 727.133406] env[61439]: DEBUG oslo.service.loopingcall [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.&lt;locals&gt;._deallocate_network_with_retries to return. 
{{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 727.133648] env[61439]: DEBUG nova.compute.manager [-] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 727.133742] env[61439]: DEBUG nova.network.neutron [-] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 727.147699] env[61439]: ERROR nova.compute.manager [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 9713d50d-981c-43bd-92b0-7c766f373792, please check neutron logs for more information. [ 727.147699] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 727.147699] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 727.147699] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 727.147699] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 727.147699] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 727.147699] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 727.147699] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 727.147699] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 727.147699] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 727.147699] 
env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 727.147699] env[61439]: ERROR nova.compute.manager raise self.value [ 727.147699] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 727.147699] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 727.147699] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 727.147699] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 727.148123] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 727.148123] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 727.148123] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 9713d50d-981c-43bd-92b0-7c766f373792, please check neutron logs for more information. 
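Both `PortBindingFailed` tracebacks terminate in `_ensure_no_port_binding_failure(port)` at neutron.py line 294. A hedged sketch of what that check does, assuming the standard neutron port-binding extension fields (`binding:vif_type`, value `binding_failed`) — the field names are assumptions about the neutron port dict, not taken from this log, and the exception class here is a simplified stand-in for `nova.exception.PortBindingFailed`:

```python
# Value neutron reports in binding:vif_type when the ML2 mechanism
# drivers could not bind the port to any host segment (assumed
# constant name; nova imports it from its network model).
VIF_TYPE_BINDING_FAILED = "binding_failed"


class PortBindingFailed(Exception):
    """Simplified stand-in for nova.exception.PortBindingFailed."""

    def __init__(self, port_id):
        self.port_id = port_id
        super().__init__(
            "Binding failed for port %s, please check neutron logs "
            "for more information." % port_id)


def ensure_no_port_binding_failure(port):
    # Neutron returns HTTP 200 even when binding fails; the failure is
    # only visible in the port body, so nova has to inspect it and
    # raise explicitly before handing the VIF to the hypervisor driver.
    if port.get("binding:vif_type") == VIF_TYPE_BINDING_FAILED:
        raise PortBindingFailed(port_id=port["id"])
```

This explains the repeated advice in the log message itself: the binding failure happened on the neutron side (no mechanism driver bound the port), so the neutron server and agent logs, not nova's, hold the root cause.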
[ 727.148123] env[61439]: ERROR nova.compute.manager [ 727.150982] env[61439]: Traceback (most recent call last): [ 727.150982] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 727.150982] env[61439]: listener.cb(fileno) [ 727.150982] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 727.150982] env[61439]: result = function(*args, **kwargs) [ 727.150982] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 727.150982] env[61439]: return func(*args, **kwargs) [ 727.150982] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 727.150982] env[61439]: raise e [ 727.150982] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 727.150982] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 727.150982] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 727.150982] env[61439]: created_port_ids = self._update_ports_for_instance( [ 727.150982] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 727.150982] env[61439]: with excutils.save_and_reraise_exception(): [ 727.150982] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 727.150982] env[61439]: self.force_reraise() [ 727.150982] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 727.150982] env[61439]: raise self.value [ 727.150982] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 727.150982] env[61439]: updated_port = self._update_port( [ 727.150982] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 727.150982] env[61439]: 
_ensure_no_port_binding_failure(port) [ 727.150982] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 727.150982] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 727.150982] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 9713d50d-981c-43bd-92b0-7c766f373792, please check neutron logs for more information. [ 727.150982] env[61439]: Removing descriptor: 18 [ 727.151684] env[61439]: ERROR nova.compute.manager [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 9713d50d-981c-43bd-92b0-7c766f373792, please check neutron logs for more information. [ 727.151684] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Traceback (most recent call last): [ 727.151684] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 727.151684] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] yield resources [ 727.151684] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 727.151684] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] self.driver.spawn(context, instance, image_meta, [ 727.151684] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 727.151684] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] self._vmops.spawn(context, instance, image_meta, injected_files, [ 
727.151684] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 727.151684] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] vm_ref = self.build_virtual_machine(instance, [ 727.151684] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 727.151913] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] vif_infos = vmwarevif.get_vif_info(self._session, [ 727.151913] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 727.151913] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] for vif in network_info: [ 727.151913] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 727.151913] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] return self._sync_wrapper(fn, *args, **kwargs) [ 727.151913] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 727.151913] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] self.wait() [ 727.151913] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 727.151913] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] self[:] = self._gt.wait() [ 727.151913] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 727.151913] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] return self._exit_event.wait() [ 727.151913] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 727.151913] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] result = hub.switch() [ 727.152201] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 727.152201] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] return self.greenlet.switch() [ 727.152201] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 727.152201] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] result = function(*args, **kwargs) [ 727.152201] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 727.152201] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] return func(*args, **kwargs) [ 727.152201] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 727.152201] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] raise e [ 727.152201] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in 
_allocate_network_async [ 727.152201] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] nwinfo = self.network_api.allocate_for_instance( [ 727.152201] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 727.152201] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] created_port_ids = self._update_ports_for_instance( [ 727.152201] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 727.152491] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] with excutils.save_and_reraise_exception(): [ 727.152491] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 727.152491] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] self.force_reraise() [ 727.152491] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 727.152491] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] raise self.value [ 727.152491] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 727.152491] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] updated_port = self._update_port( [ 727.152491] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/network/neutron.py", line 585, in 
_update_port [ 727.152491] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] _ensure_no_port_binding_failure(port) [ 727.152491] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 727.152491] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] raise exception.PortBindingFailed(port_id=port['id']) [ 727.152491] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] nova.exception.PortBindingFailed: Binding failed for port 9713d50d-981c-43bd-92b0-7c766f373792, please check neutron logs for more information. [ 727.152491] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] [ 727.152819] env[61439]: INFO nova.compute.manager [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Terminating instance [ 727.153927] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquiring lock "refresh_cache-c9aebd15-4afa-47b4-9072-40ea371c2857" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 727.153927] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquired lock "refresh_cache-c9aebd15-4afa-47b4-9072-40ea371c2857" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 727.154162] env[61439]: DEBUG nova.network.neutron [None req-8a009510-42db-4540-8479-8a6ed2a40dbd 
tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 727.252546] env[61439]: DEBUG nova.network.neutron [-] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 727.264813] env[61439]: DEBUG nova.network.neutron [-] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 727.276413] env[61439]: INFO nova.compute.manager [-] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Took 0.14 seconds to deallocate network for instance. [ 727.280935] env[61439]: DEBUG nova.compute.claims [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 727.281373] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 727.282503] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=61439) 
inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 727.286833] env[61439]: DEBUG nova.network.neutron [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 727.447606] env[61439]: DEBUG nova.network.neutron [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 727.456379] env[61439]: DEBUG nova.network.neutron [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Successfully created port: 6cbd0454-8188-4cc6-859b-95c690163dd2 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 727.489114] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-060d25d4-5608-48dd-977b-e2664f8b4433 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 727.491632] env[61439]: DEBUG oslo_concurrency.lockutils [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Releasing lock "refresh_cache-b63e50ed-41f9-4e77-9b94-24915b296e2d" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 727.492411] env[61439]: DEBUG nova.compute.manager [None 
req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 727.492631] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 727.496049] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9bc62134-d16c-47ca-b6fb-7916376d407c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 727.503078] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef25a160-5d09-465b-b62e-73ce2c08560f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 727.509718] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58df1082-ad53-4c8d-ac64-a05debe9be5b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 727.547492] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-448642f7-5fcd-4c74-aa12-ecc4b3e3bc82 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 727.560369] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0858408a-6028-407b-b074-bb057f3ac636 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 727.567370] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b63e50ed-41f9-4e77-9b94-24915b296e2d could not be found. [ 727.567370] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 727.567370] env[61439]: INFO nova.compute.manager [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Took 0.07 seconds to destroy the instance on the hypervisor. [ 727.567370] env[61439]: DEBUG oslo.service.loopingcall [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.&lt;locals&gt;._deallocate_network_with_retries to return. 
{{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 727.568086] env[61439]: DEBUG nova.compute.manager [-] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 727.568086] env[61439]: DEBUG nova.network.neutron [-] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 727.581097] env[61439]: DEBUG nova.compute.provider_tree [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 727.592088] env[61439]: DEBUG nova.scheduler.client.report [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 727.618617] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.337s 
{{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 727.621325] env[61439]: ERROR nova.compute.manager [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port c518959c-f1d5-4422-8c26-5d7f93544273, please check neutron logs for more information. [ 727.621325] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Traceback (most recent call last): [ 727.621325] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 727.621325] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] self.driver.spawn(context, instance, image_meta, [ 727.621325] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 727.621325] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 727.621325] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 727.621325] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] vm_ref = self.build_virtual_machine(instance, [ 727.621325] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 727.621325] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] vif_infos = 
vmwarevif.get_vif_info(self._session, [ 727.621325] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 727.621827] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] for vif in network_info: [ 727.621827] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 727.621827] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] return self._sync_wrapper(fn, *args, **kwargs) [ 727.621827] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 727.621827] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] self.wait() [ 727.621827] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 727.621827] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] self[:] = self._gt.wait() [ 727.621827] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 727.621827] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] return self._exit_event.wait() [ 727.621827] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 727.621827] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] result = hub.switch() [ 727.621827] env[61439]: ERROR nova.compute.manager [instance: 
3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 727.621827] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] return self.greenlet.switch() [ 727.622167] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 727.622167] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] result = function(*args, **kwargs) [ 727.622167] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 727.622167] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] return func(*args, **kwargs) [ 727.622167] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 727.622167] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] raise e [ 727.622167] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 727.622167] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] nwinfo = self.network_api.allocate_for_instance( [ 727.622167] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 727.622167] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] created_port_ids = self._update_ports_for_instance( [ 727.622167] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File 
"/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 727.622167] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] with excutils.save_and_reraise_exception(): [ 727.622167] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 727.622464] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] self.force_reraise() [ 727.622464] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 727.622464] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] raise self.value [ 727.622464] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 727.622464] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] updated_port = self._update_port( [ 727.622464] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 727.622464] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] _ensure_no_port_binding_failure(port) [ 727.622464] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 727.622464] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] raise exception.PortBindingFailed(port_id=port['id']) [ 727.622464] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] 
nova.exception.PortBindingFailed: Binding failed for port c518959c-f1d5-4422-8c26-5d7f93544273, please check neutron logs for more information. [ 727.622464] env[61439]: ERROR nova.compute.manager [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] [ 727.622744] env[61439]: DEBUG nova.compute.utils [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Binding failed for port c518959c-f1d5-4422-8c26-5d7f93544273, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 727.625358] env[61439]: DEBUG nova.compute.manager [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Build of instance 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb was re-scheduled: Binding failed for port c518959c-f1d5-4422-8c26-5d7f93544273, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 727.625540] env[61439]: DEBUG nova.compute.manager [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 727.625688] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Acquiring lock "refresh_cache-3495611c-05aa-4c1c-89fc-16ecb1fcb6bb" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 727.625874] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Acquired lock "refresh_cache-3495611c-05aa-4c1c-89fc-16ecb1fcb6bb" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 727.626364] env[61439]: DEBUG nova.network.neutron [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 727.643458] env[61439]: DEBUG nova.network.neutron [-] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 727.657664] env[61439]: DEBUG nova.network.neutron [-] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 727.666616] env[61439]: INFO nova.compute.manager [-] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Took 0.10 seconds to deallocate network for instance. [ 727.668922] env[61439]: DEBUG nova.compute.claims [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 727.669145] env[61439]: DEBUG oslo_concurrency.lockutils [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 727.669423] env[61439]: DEBUG oslo_concurrency.lockutils [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 727.843386] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8464980-df91-462b-812d-a19544bdf750 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 727.849022] env[61439]: DEBUG 
nova.network.neutron [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 727.853504] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8e1dc97-eee8-4a84-9f3a-68d57d3ede39 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 727.883684] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69c1f450-e113-4d7b-bf32-fece57d4e27a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 727.891019] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7b3d0db-09e9-4ee2-b0ba-770424dc65c3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 727.906097] env[61439]: DEBUG nova.compute.provider_tree [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 727.915475] env[61439]: DEBUG nova.scheduler.client.report [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 
'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 727.943643] env[61439]: DEBUG oslo_concurrency.lockutils [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.274s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 727.944327] env[61439]: ERROR nova.compute.manager [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port c142ff91-f8c7-47fa-849e-e56de72d907c, please check neutron logs for more information. 
[ 727.944327] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Traceback (most recent call last): [ 727.944327] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 727.944327] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] self.driver.spawn(context, instance, image_meta, [ 727.944327] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 727.944327] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 727.944327] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 727.944327] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] vm_ref = self.build_virtual_machine(instance, [ 727.944327] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 727.944327] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] vif_infos = vmwarevif.get_vif_info(self._session, [ 727.944327] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 727.944629] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] for vif in network_info: [ 727.944629] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 727.944629] env[61439]: ERROR 
nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] return self._sync_wrapper(fn, *args, **kwargs) [ 727.944629] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 727.944629] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] self.wait() [ 727.944629] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 727.944629] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] self[:] = self._gt.wait() [ 727.944629] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 727.944629] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] return self._exit_event.wait() [ 727.944629] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 727.944629] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] result = hub.switch() [ 727.944629] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 727.944629] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] return self.greenlet.switch() [ 727.944933] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 727.944933] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] result 
= function(*args, **kwargs) [ 727.944933] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 727.944933] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] return func(*args, **kwargs) [ 727.944933] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 727.944933] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] raise e [ 727.944933] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 727.944933] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] nwinfo = self.network_api.allocate_for_instance( [ 727.944933] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 727.944933] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] created_port_ids = self._update_ports_for_instance( [ 727.944933] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 727.944933] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] with excutils.save_and_reraise_exception(): [ 727.944933] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 727.945306] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] self.force_reraise() [ 727.945306] env[61439]: 
ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 727.945306] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] raise self.value [ 727.945306] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 727.945306] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] updated_port = self._update_port( [ 727.945306] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 727.945306] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] _ensure_no_port_binding_failure(port) [ 727.945306] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 727.945306] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] raise exception.PortBindingFailed(port_id=port['id']) [ 727.945306] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] nova.exception.PortBindingFailed: Binding failed for port c142ff91-f8c7-47fa-849e-e56de72d907c, please check neutron logs for more information. 
[ 727.945306] env[61439]: ERROR nova.compute.manager [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] [ 727.945575] env[61439]: DEBUG nova.compute.utils [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Binding failed for port c142ff91-f8c7-47fa-849e-e56de72d907c, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 727.946692] env[61439]: DEBUG nova.compute.manager [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Build of instance b63e50ed-41f9-4e77-9b94-24915b296e2d was re-scheduled: Binding failed for port c142ff91-f8c7-47fa-849e-e56de72d907c, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 727.947134] env[61439]: DEBUG nova.compute.manager [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 727.947364] env[61439]: DEBUG oslo_concurrency.lockutils [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Acquiring lock "refresh_cache-b63e50ed-41f9-4e77-9b94-24915b296e2d" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 727.947518] env[61439]: DEBUG oslo_concurrency.lockutils [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 
tempest-VolumesAdminNegativeTest-1485637538-project-member] Acquired lock "refresh_cache-b63e50ed-41f9-4e77-9b94-24915b296e2d" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 727.947675] env[61439]: DEBUG nova.network.neutron [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 728.049421] env[61439]: DEBUG nova.network.neutron [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 728.057257] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Acquiring lock "2f7123c7-f863-4bb6-a899-5feb618c6ce0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 728.057512] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Lock "2f7123c7-f863-4bb6-a899-5feb618c6ce0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 728.076317] env[61439]: DEBUG 
nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 728.089943] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Acquiring lock "74e5ad9b-1b5b-492d-8642-ed271d8f70e3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 728.090237] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Lock "74e5ad9b-1b5b-492d-8642-ed271d8f70e3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 728.106651] env[61439]: DEBUG nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 728.127589] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Acquiring lock "d79dac42-38fa-401b-9864-5fbdf80b89ec" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 728.127589] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Lock "d79dac42-38fa-401b-9864-5fbdf80b89ec" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 728.135093] env[61439]: DEBUG nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 728.153813] env[61439]: DEBUG nova.network.neutron [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 728.156267] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 728.157113] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 728.157971] env[61439]: INFO nova.compute.claims [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 728.163916] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Releasing lock "refresh_cache-c9aebd15-4afa-47b4-9072-40ea371c2857" {{(pid=61439) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 728.164330] env[61439]: DEBUG nova.compute.manager [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 728.164546] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 728.167162] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ef986d7e-50f1-4619-bea7-3fe9785ef2fb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 728.180065] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6b4dc03-505a-4c14-adc9-3d5e7ba71ec0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 728.194533] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 728.208378] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 
c9aebd15-4afa-47b4-9072-40ea371c2857] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c9aebd15-4afa-47b4-9072-40ea371c2857 could not be found. [ 728.208612] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 728.208793] env[61439]: INFO nova.compute.manager [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Took 0.04 seconds to destroy the instance on the hypervisor. [ 728.209060] env[61439]: DEBUG oslo.service.loopingcall [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 728.210049] env[61439]: DEBUG nova.compute.manager [-] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 728.210153] env[61439]: DEBUG nova.network.neutron [-] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 728.233381] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 728.316401] env[61439]: DEBUG nova.network.neutron [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 728.318155] env[61439]: DEBUG nova.network.neutron [-] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 728.327028] env[61439]: DEBUG oslo_concurrency.lockutils [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Releasing lock "refresh_cache-b63e50ed-41f9-4e77-9b94-24915b296e2d" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 728.327290] env[61439]: DEBUG nova.compute.manager [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 728.327442] env[61439]: DEBUG nova.compute.manager [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 728.328208] env[61439]: DEBUG nova.network.neutron [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 728.332801] env[61439]: DEBUG nova.network.neutron [-] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 728.348541] env[61439]: INFO nova.compute.manager [-] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Took 0.14 seconds to deallocate 
network for instance. [ 728.354623] env[61439]: DEBUG nova.compute.claims [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 728.354795] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 728.413940] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26545e65-c1fc-41ae-a6b6-b66bba37465b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 728.425937] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1359c7bf-d077-49a7-9144-0a5346c5dc2f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 728.431415] env[61439]: DEBUG nova.network.neutron [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 728.465406] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb7203d1-4253-4cf8-a29f-3d2ce67da20a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 728.468491] env[61439]: DEBUG nova.network.neutron [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 728.476176] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18fadad3-e3e3-46c1-9620-ee6351c44503 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 728.481932] env[61439]: DEBUG nova.network.neutron [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 728.495292] env[61439]: DEBUG nova.compute.provider_tree [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 728.496863] env[61439]: INFO nova.compute.manager [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 
tempest-VolumesAdminNegativeTest-1485637538-project-member] [instance: b63e50ed-41f9-4e77-9b94-24915b296e2d] Took 0.17 seconds to deallocate network for instance. [ 728.501163] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Releasing lock "refresh_cache-3495611c-05aa-4c1c-89fc-16ecb1fcb6bb" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 728.501163] env[61439]: DEBUG nova.compute.manager [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 728.501163] env[61439]: DEBUG nova.compute.manager [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 728.501163] env[61439]: DEBUG nova.network.neutron [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 728.517854] env[61439]: DEBUG nova.scheduler.client.report [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 
'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 728.547301] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.391s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 728.547822] env[61439]: DEBUG nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Start building networks asynchronously for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 728.553649] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.359s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 728.555107] env[61439]: INFO nova.compute.claims [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 728.577137] env[61439]: DEBUG nova.network.neutron [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 728.593547] env[61439]: DEBUG nova.network.neutron [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 728.606711] env[61439]: DEBUG nova.compute.utils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 728.607624] env[61439]: DEBUG nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 728.607793] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 728.614424] env[61439]: INFO nova.compute.manager [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] [instance: 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb] Took 0.11 seconds to deallocate network for instance. 
[ 728.648869] env[61439]: DEBUG nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 728.691986] env[61439]: INFO nova.scheduler.client.report [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Deleted allocations for instance b63e50ed-41f9-4e77-9b94-24915b296e2d [ 728.765549] env[61439]: DEBUG oslo_concurrency.lockutils [None req-37aff3b4-8eb2-4c6d-8ae0-6649f5a5f754 tempest-VolumesAdminNegativeTest-1485637538 tempest-VolumesAdminNegativeTest-1485637538-project-member] Lock "b63e50ed-41f9-4e77-9b94-24915b296e2d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 9.447s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 728.808287] env[61439]: DEBUG nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 728.813259] env[61439]: INFO nova.scheduler.client.report [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Deleted allocations for instance 3495611c-05aa-4c1c-89fc-16ecb1fcb6bb [ 728.848862] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8ced7a8c-1107-466d-bcc2-b14e2a058fa2 tempest-ServersTestManualDisk-714945568 tempest-ServersTestManualDisk-714945568-project-member] Lock "3495611c-05aa-4c1c-89fc-16ecb1fcb6bb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.693s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 728.869525] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 728.869525] env[61439]: DEBUG nova.virt.hardware [None 
req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 728.869525] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 728.869708] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 728.870376] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 728.874023] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 728.874023] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 
728.874023] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 728.874023] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 728.874023] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 728.874367] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 728.874367] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3026f711-86a7-4895-9c6f-6bbe478dd5d8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 728.887622] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57d5a2d7-bac0-4f39-b958-37d7d86d0af2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 728.951152] env[61439]: DEBUG oslo_concurrency.lockutils [None 
req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Acquiring lock "2398e0a0-f13e-47e7-b735-906694ea4d58" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 728.951783] env[61439]: DEBUG oslo_concurrency.lockutils [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Lock "2398e0a0-f13e-47e7-b735-906694ea4d58" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 728.965811] env[61439]: DEBUG nova.compute.manager [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 728.980591] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38fa2efe-c589-403a-b566-39335dcf708b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 728.995620] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec906cc5-2475-4e16-8857-317e45d9430a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 729.036667] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b0a6cb0-f536-4e1f-8051-9b2e76dba290 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 729.048969] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cef810b3-97eb-4b26-b075-86d45cb5b8e5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 729.063955] env[61439]: DEBUG nova.compute.provider_tree [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 729.070853] env[61439]: DEBUG oslo_concurrency.lockutils [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 729.080319] 
env[61439]: DEBUG nova.scheduler.client.report [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 729.109450] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.556s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 729.109977] env[61439]: DEBUG nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Start building networks asynchronously for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 729.112900] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.879s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 729.113842] env[61439]: INFO nova.compute.claims [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 729.169298] env[61439]: DEBUG nova.compute.utils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 729.174026] env[61439]: DEBUG nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 729.174026] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 729.189016] env[61439]: DEBUG nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 729.285566] env[61439]: DEBUG nova.policy [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4d3dcf628fb041a2a1e7885e53c961b4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dbc4594ef23b4419ac60a7a2195b2a94', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 729.316999] env[61439]: DEBUG nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 729.350639] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 729.350639] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 729.350639] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 729.351011] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 
tempest-ListServersNegativeTestJSON-979928406-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 729.351011] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 729.351011] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 729.351011] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 729.351011] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 729.351334] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 729.351879] env[61439]: DEBUG nova.virt.hardware [None 
req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 729.351879] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 729.352673] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99d9c191-66bb-4b8b-9348-8b31a3646ebb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 729.369100] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f71ca177-9c05-4523-9b73-5f4fce9a0591 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 729.417515] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ada363e4-5b46-4222-9585-a912e05c8375 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 729.430325] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e6e365e-8ac4-441e-855d-13b0b21cd1f8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 729.464602] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-750c2ee3-db16-4891-b760-b8d111ed0bcf {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 729.471692] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-078e863e-c058-4acf-8ec3-902152fd1dd8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 729.485601] env[61439]: DEBUG nova.compute.provider_tree [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 729.497703] env[61439]: DEBUG nova.scheduler.client.report [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 729.520909] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.408s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 729.521749] env[61439]: DEBUG nova.compute.manager [None 
req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 729.524665] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 1.170s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 729.538269] env[61439]: DEBUG nova.policy [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4d3dcf628fb041a2a1e7885e53c961b4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dbc4594ef23b4419ac60a7a2195b2a94', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 729.578999] env[61439]: DEBUG nova.compute.utils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 729.580846] env[61439]: DEBUG nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 
tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 729.581025] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 729.594908] env[61439]: DEBUG nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 729.670177] env[61439]: DEBUG nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 729.699535] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 729.699806] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 729.699965] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 729.700825] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 
tempest-ListServersNegativeTestJSON-979928406-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 729.701025] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 729.701320] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 729.701555] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 729.701800] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 729.702117] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 729.702375] env[61439]: DEBUG nova.virt.hardware [None 
req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 729.702593] env[61439]: DEBUG nova.virt.hardware [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 729.704176] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-836444af-eae1-4cb4-9dd6-5fa5479ca5e9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 729.716492] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14ab71ee-ffe9-41d0-994a-5a524dafc1b1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 729.739915] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7eceaf9-21f0-4a4a-99dc-3772ef462323 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 729.747422] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ceef27a7-06db-4336-a8d1-e05f0701a62e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 729.778585] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8805771-f64a-4450-bec3-c5e27b817ff6 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 729.786065] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3c12cbc-852e-41a9-b620-ad372876da7e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 729.800380] env[61439]: DEBUG nova.compute.provider_tree [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 729.812298] env[61439]: DEBUG nova.scheduler.client.report [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 729.826351] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.302s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 729.826970] env[61439]: ERROR nova.compute.manager [None 
req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 9713d50d-981c-43bd-92b0-7c766f373792, please check neutron logs for more information. [ 729.826970] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Traceback (most recent call last): [ 729.826970] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 729.826970] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] self.driver.spawn(context, instance, image_meta, [ 729.826970] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 729.826970] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] self._vmops.spawn(context, instance, image_meta, injected_files, [ 729.826970] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 729.826970] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] vm_ref = self.build_virtual_machine(instance, [ 729.826970] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 729.826970] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] vif_infos = vmwarevif.get_vif_info(self._session, [ 729.826970] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File 
"/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 729.828264] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] for vif in network_info: [ 729.828264] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 729.828264] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] return self._sync_wrapper(fn, *args, **kwargs) [ 729.828264] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 729.828264] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] self.wait() [ 729.828264] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 729.828264] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] self[:] = self._gt.wait() [ 729.828264] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 729.828264] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] return self._exit_event.wait() [ 729.828264] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 729.828264] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] result = hub.switch() [ 729.828264] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 729.828264] env[61439]: ERROR 
nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] return self.greenlet.switch() [ 729.829304] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 729.829304] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] result = function(*args, **kwargs) [ 729.829304] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 729.829304] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] return func(*args, **kwargs) [ 729.829304] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 729.829304] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] raise e [ 729.829304] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 729.829304] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] nwinfo = self.network_api.allocate_for_instance( [ 729.829304] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 729.829304] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] created_port_ids = self._update_ports_for_instance( [ 729.829304] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 729.829304] env[61439]: ERROR nova.compute.manager [instance: 
c9aebd15-4afa-47b4-9072-40ea371c2857] with excutils.save_and_reraise_exception(): [ 729.829304] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 729.829576] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] self.force_reraise() [ 729.829576] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 729.829576] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] raise self.value [ 729.829576] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 729.829576] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] updated_port = self._update_port( [ 729.829576] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 729.829576] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] _ensure_no_port_binding_failure(port) [ 729.829576] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 729.829576] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] raise exception.PortBindingFailed(port_id=port['id']) [ 729.829576] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] nova.exception.PortBindingFailed: Binding failed for port 9713d50d-981c-43bd-92b0-7c766f373792, please check neutron logs for more information. 
[ 729.829576] env[61439]: ERROR nova.compute.manager [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] [ 729.829807] env[61439]: DEBUG nova.compute.utils [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Binding failed for port 9713d50d-981c-43bd-92b0-7c766f373792, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 729.829807] env[61439]: DEBUG oslo_concurrency.lockutils [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.758s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 729.830519] env[61439]: INFO nova.compute.claims [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 729.835039] env[61439]: DEBUG nova.compute.manager [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Build of instance c9aebd15-4afa-47b4-9072-40ea371c2857 was re-scheduled: Binding failed for port 9713d50d-981c-43bd-92b0-7c766f373792, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 729.835039] env[61439]: DEBUG nova.compute.manager [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 729.835039] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquiring lock "refresh_cache-c9aebd15-4afa-47b4-9072-40ea371c2857" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 729.835039] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquired lock "refresh_cache-c9aebd15-4afa-47b4-9072-40ea371c2857" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 729.835244] env[61439]: DEBUG nova.network.neutron [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 729.854966] env[61439]: DEBUG nova.policy [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4d3dcf628fb041a2a1e7885e53c961b4', 'user_domain_id': 'default', 'system_scope': None, 
'domain_id': None, 'project_id': 'dbc4594ef23b4419ac60a7a2195b2a94', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 729.904052] env[61439]: DEBUG nova.network.neutron [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 730.028950] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66f0ee14-14bc-42eb-8c05-675c4283095c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.036758] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f641bb9-5259-4323-8e7d-8c844b87e0ff {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.068817] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90b033f3-8cc1-43fd-aabd-5423fc39f15f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.076161] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a545ffee-de67-44eb-81be-75c075788768 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.089381] env[61439]: DEBUG nova.compute.provider_tree [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 
tempest-ServerActionsTestOtherB-1936320080-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 730.098171] env[61439]: DEBUG nova.scheduler.client.report [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 730.113073] env[61439]: DEBUG oslo_concurrency.lockutils [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.284s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 730.113631] env[61439]: DEBUG nova.compute.manager [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Start building networks asynchronously for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 730.149115] env[61439]: DEBUG nova.compute.utils [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 730.151259] env[61439]: DEBUG nova.compute.manager [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 730.151433] env[61439]: DEBUG nova.network.neutron [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 730.159655] env[61439]: DEBUG nova.compute.manager [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 730.235486] env[61439]: DEBUG nova.compute.manager [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 730.256536] env[61439]: DEBUG nova.virt.hardware [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 730.256772] env[61439]: DEBUG nova.virt.hardware [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 730.257035] env[61439]: DEBUG nova.virt.hardware [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 730.257122] env[61439]: DEBUG nova.virt.hardware [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Flavor pref 
0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 730.257273] env[61439]: DEBUG nova.virt.hardware [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 730.257421] env[61439]: DEBUG nova.virt.hardware [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 730.257630] env[61439]: DEBUG nova.virt.hardware [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 730.257790] env[61439]: DEBUG nova.virt.hardware [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 730.257958] env[61439]: DEBUG nova.virt.hardware [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 730.258132] env[61439]: DEBUG nova.virt.hardware [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 
tempest-ServerActionsTestOtherB-1936320080-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 730.258304] env[61439]: DEBUG nova.virt.hardware [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 730.259157] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ef95d89-9933-4363-8c5d-d4ff5ce5cfc8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.267611] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d81da472-d63e-48a3-926c-79c11e4f12d4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.672098] env[61439]: DEBUG nova.policy [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '034c0fef0ca64d55b894d99c202b880d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1e76f73da19144f5b09ffee1f1d8853a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 730.731814] env[61439]: DEBUG nova.network.neutron [None req-8a009510-42db-4540-8479-8a6ed2a40dbd 
tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 730.742951] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Releasing lock "refresh_cache-c9aebd15-4afa-47b4-9072-40ea371c2857" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 730.743223] env[61439]: DEBUG nova.compute.manager [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 730.743407] env[61439]: DEBUG nova.compute.manager [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 730.743620] env[61439]: DEBUG nova.network.neutron [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 730.820867] env[61439]: DEBUG nova.network.neutron [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 
tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 730.831456] env[61439]: DEBUG nova.network.neutron [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 730.845153] env[61439]: INFO nova.compute.manager [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: c9aebd15-4afa-47b4-9072-40ea371c2857] Took 0.10 seconds to deallocate network for instance. [ 730.959788] env[61439]: INFO nova.scheduler.client.report [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Deleted allocations for instance c9aebd15-4afa-47b4-9072-40ea371c2857 [ 730.991233] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8a009510-42db-4540-8479-8a6ed2a40dbd tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "c9aebd15-4afa-47b4-9072-40ea371c2857" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 16.055s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 731.924021] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Successfully created port: 
f7421b70-6d30-4120-be13-e6378ad0ffdb {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 732.051813] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Successfully created port: 17e9cb25-c36e-4d26-b65a-e0135ece2194 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 732.148018] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Successfully created port: e6d1e53a-74f0-4da3-8ede-885bbd0c7111 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 732.754693] env[61439]: DEBUG nova.network.neutron [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Successfully created port: 905d20ea-65b5-4d7a-a080-9a1bfb47fb42 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 732.893747] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Acquiring lock "abdb8903-7574-40a4-ac7e-345b22fe1141" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 732.894009] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 
tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Lock "abdb8903-7574-40a4-ac7e-345b22fe1141" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 732.906294] env[61439]: DEBUG nova.compute.manager [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 732.966086] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 732.966457] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 732.967932] env[61439]: INFO nova.compute.claims [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 733.194136] env[61439]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6caa1951-07b1-4210-8e5a-f13701c08b9e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 733.203863] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d58e6e0a-4164-45b1-89b6-c3a4c91b6d04 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 733.240603] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28c662e7-b006-4912-8a87-31df49c9b31a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 733.249408] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3f4d350-297c-4650-b27f-4a2dec5f63ff {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 733.266877] env[61439]: DEBUG nova.compute.provider_tree [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 733.283103] env[61439]: DEBUG nova.scheduler.client.report [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': 
{'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 733.306082] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.340s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 733.306908] env[61439]: DEBUG nova.compute.manager [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 733.357984] env[61439]: DEBUG nova.compute.utils [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 733.359702] env[61439]: DEBUG nova.compute.manager [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 733.359789] env[61439]: DEBUG nova.network.neutron [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 733.372202] env[61439]: DEBUG nova.compute.manager [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 733.485723] env[61439]: DEBUG nova.compute.manager [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 733.523447] env[61439]: DEBUG nova.virt.hardware [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 733.523765] env[61439]: DEBUG nova.virt.hardware [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 733.523991] env[61439]: DEBUG nova.virt.hardware [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 733.524423] env[61439]: DEBUG nova.virt.hardware [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 
tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 733.524423] env[61439]: DEBUG nova.virt.hardware [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 733.525085] env[61439]: DEBUG nova.virt.hardware [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 733.525085] env[61439]: DEBUG nova.virt.hardware [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 733.525266] env[61439]: DEBUG nova.virt.hardware [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 733.525499] env[61439]: DEBUG nova.virt.hardware [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 733.525892] 
env[61439]: DEBUG nova.virt.hardware [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 733.525938] env[61439]: DEBUG nova.virt.hardware [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 733.527990] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc9235f1-7448-4d40-a144-d5b6d0a55b3f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 733.537553] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1e4592b-b0f3-4182-9aaf-6db736c09daf {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 733.590515] env[61439]: DEBUG nova.policy [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '02802ee5cbb14e6f9344e7479cfa2227', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '26026d6a7e084a0f81eba6e8b01d81d3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize 
/opt/stack/nova/nova/policy.py:203}} [ 735.089696] env[61439]: DEBUG nova.network.neutron [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Successfully created port: 55d06218-1ea4-47f7-81e5-59f167d3ea24 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 736.391265] env[61439]: WARNING oslo_vmware.rw_handles [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 736.391265] env[61439]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 736.391265] env[61439]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 736.391265] env[61439]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 736.391265] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 736.391265] env[61439]: ERROR oslo_vmware.rw_handles response.begin() [ 736.391265] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 736.391265] env[61439]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 736.391265] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 736.391265] env[61439]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 736.391265] env[61439]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 736.391265] env[61439]: ERROR oslo_vmware.rw_handles [ 736.391773] env[61439]: DEBUG 
nova.virt.vmwareapi.images [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Downloaded image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to vmware_temp/946b6cdf-92de-4e95-9322-e85f342255a0/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 736.394157] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Caching image {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 736.394616] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Copying Virtual Disk [datastore2] vmware_temp/946b6cdf-92de-4e95-9322-e85f342255a0/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk to [datastore2] vmware_temp/946b6cdf-92de-4e95-9322-e85f342255a0/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk {{(pid=61439) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 736.394925] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1aa10d8a-9b9b-4e10-816e-6005a3da27d1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 736.404110] env[61439]: DEBUG oslo_vmware.api [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Waiting for the task: (returnval){ [ 736.404110] env[61439]: value = "task-987667" [ 736.404110] env[61439]: _type = "Task" [ 
736.404110] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 736.414423] env[61439]: DEBUG oslo_vmware.api [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Task: {'id': task-987667, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 736.610351] env[61439]: ERROR nova.compute.manager [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 6cbd0454-8188-4cc6-859b-95c690163dd2, please check neutron logs for more information. [ 736.610351] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 736.610351] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 736.610351] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 736.610351] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 736.610351] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 736.610351] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 736.610351] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 736.610351] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 736.610351] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 736.610351] env[61439]: ERROR nova.compute.manager File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 736.610351] env[61439]: ERROR nova.compute.manager raise self.value [ 736.610351] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 736.610351] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 736.610351] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 736.610351] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 736.610791] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 736.610791] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 736.610791] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 6cbd0454-8188-4cc6-859b-95c690163dd2, please check neutron logs for more information. 
[ 736.610791] env[61439]: ERROR nova.compute.manager [ 736.610791] env[61439]: Traceback (most recent call last): [ 736.610791] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 736.610791] env[61439]: listener.cb(fileno) [ 736.610791] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 736.610791] env[61439]: result = function(*args, **kwargs) [ 736.610791] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 736.610791] env[61439]: return func(*args, **kwargs) [ 736.610791] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 736.610791] env[61439]: raise e [ 736.610791] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 736.610791] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 736.610791] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 736.610791] env[61439]: created_port_ids = self._update_ports_for_instance( [ 736.610791] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 736.610791] env[61439]: with excutils.save_and_reraise_exception(): [ 736.610791] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 736.610791] env[61439]: self.force_reraise() [ 736.610791] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 736.610791] env[61439]: raise self.value [ 736.610791] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 736.610791] env[61439]: updated_port = self._update_port( [ 736.610791] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 736.610791] env[61439]: 
_ensure_no_port_binding_failure(port) [ 736.610791] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 736.610791] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 736.611471] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 6cbd0454-8188-4cc6-859b-95c690163dd2, please check neutron logs for more information. [ 736.611471] env[61439]: Removing descriptor: 22 [ 736.611471] env[61439]: ERROR nova.compute.manager [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 6cbd0454-8188-4cc6-859b-95c690163dd2, please check neutron logs for more information. [ 736.611471] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Traceback (most recent call last): [ 736.611471] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 736.611471] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] yield resources [ 736.611471] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 736.611471] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] self.driver.spawn(context, instance, image_meta, [ 736.611471] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 736.611471] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 
736.611471] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 736.611471] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] vm_ref = self.build_virtual_machine(instance, [ 736.611740] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 736.611740] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] vif_infos = vmwarevif.get_vif_info(self._session, [ 736.611740] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 736.611740] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] for vif in network_info: [ 736.611740] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 736.611740] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] return self._sync_wrapper(fn, *args, **kwargs) [ 736.611740] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 736.611740] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] self.wait() [ 736.611740] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 736.611740] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] self[:] = self._gt.wait() [ 736.611740] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 736.611740] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] return self._exit_event.wait() [ 736.611740] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 736.612041] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] result = hub.switch() [ 736.612041] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 736.612041] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] return self.greenlet.switch() [ 736.612041] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 736.612041] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] result = function(*args, **kwargs) [ 736.612041] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 736.612041] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] return func(*args, **kwargs) [ 736.612041] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 736.612041] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] raise e [ 736.612041] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in 
_allocate_network_async [ 736.612041] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] nwinfo = self.network_api.allocate_for_instance( [ 736.612041] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 736.612041] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] created_port_ids = self._update_ports_for_instance( [ 736.612340] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 736.612340] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] with excutils.save_and_reraise_exception(): [ 736.612340] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 736.612340] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] self.force_reraise() [ 736.612340] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 736.612340] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] raise self.value [ 736.612340] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 736.612340] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] updated_port = self._update_port( [ 736.612340] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/network/neutron.py", line 585, in 
_update_port [ 736.612340] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] _ensure_no_port_binding_failure(port) [ 736.612340] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 736.612340] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] raise exception.PortBindingFailed(port_id=port['id']) [ 736.612712] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] nova.exception.PortBindingFailed: Binding failed for port 6cbd0454-8188-4cc6-859b-95c690163dd2, please check neutron logs for more information. [ 736.612712] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] [ 736.612712] env[61439]: INFO nova.compute.manager [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Terminating instance [ 736.615129] env[61439]: DEBUG oslo_concurrency.lockutils [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "refresh_cache-fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 736.615129] env[61439]: DEBUG oslo_concurrency.lockutils [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquired lock "refresh_cache-fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 736.615129] env[61439]: DEBUG nova.network.neutron [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 
tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 736.741990] env[61439]: DEBUG nova.network.neutron [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 736.917506] env[61439]: DEBUG oslo_vmware.exceptions [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Fault InvalidArgument not matched. {{(pid=61439) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 736.917850] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 736.918507] env[61439]: ERROR nova.compute.manager [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 736.918507] env[61439]: Faults: ['InvalidArgument'] [ 736.918507] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Traceback (most recent call last): [ 736.918507] env[61439]: ERROR nova.compute.manager [instance: 
a41cb33f-8340-4b15-b19d-0a7b9396eae7] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 736.918507] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] yield resources [ 736.918507] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 736.918507] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] self.driver.spawn(context, instance, image_meta, [ 736.918507] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 736.918507] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 736.918507] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 736.918507] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] self._fetch_image_if_missing(context, vi) [ 736.918507] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 736.918874] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] image_cache(vi, tmp_image_ds_loc) [ 736.918874] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 736.918874] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] vm_util.copy_virtual_disk( [ 736.918874] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 736.918874] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] session._wait_for_task(vmdk_copy_task) [ 736.918874] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 736.918874] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] return self.wait_for_task(task_ref) [ 736.918874] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 736.918874] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] return evt.wait() [ 736.918874] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 736.918874] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] result = hub.switch() [ 736.918874] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 736.918874] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] return self.greenlet.switch() [ 736.919198] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 736.919198] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] self.f(*self.args, **self.kw) [ 736.919198] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 736.919198] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] raise exceptions.translate_fault(task_info.error) [ 736.919198] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 736.919198] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Faults: ['InvalidArgument'] [ 736.919198] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] [ 736.919198] env[61439]: INFO nova.compute.manager [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Terminating instance [ 736.921040] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 736.921251] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 736.921798] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Acquiring lock "refresh_cache-a41cb33f-8340-4b15-b19d-0a7b9396eae7" {{(pid=61439) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 736.921996] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Acquired lock "refresh_cache-a41cb33f-8340-4b15-b19d-0a7b9396eae7" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 736.922236] env[61439]: DEBUG nova.network.neutron [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 736.923229] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bfcba3ef-3dac-4fba-a5c7-3bd340fa955a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 736.931622] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 736.932949] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=61439) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 736.934405] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a1967c40-6f5d-4b3c-8899-8b1481a5fecd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 736.943014] env[61439]: DEBUG oslo_vmware.api [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Waiting for the task: (returnval){ [ 736.943014] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52c41b9c-b022-65db-b9ac-e7de11da0505" [ 736.943014] env[61439]: _type = "Task" [ 736.943014] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 736.953019] env[61439]: DEBUG oslo_vmware.api [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52c41b9c-b022-65db-b9ac-e7de11da0505, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 737.064227] env[61439]: DEBUG nova.network.neutron [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 737.209604] env[61439]: DEBUG nova.compute.manager [req-ec7e1f20-dc57-4335-8b8d-ec85db8b8a77 req-b4eecc69-40cb-49ae-9256-b8c2d07709bd service nova] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Received event network-changed-6cbd0454-8188-4cc6-859b-95c690163dd2 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 737.210045] env[61439]: DEBUG nova.compute.manager [req-ec7e1f20-dc57-4335-8b8d-ec85db8b8a77 req-b4eecc69-40cb-49ae-9256-b8c2d07709bd service nova] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Refreshing instance network info cache due to event network-changed-6cbd0454-8188-4cc6-859b-95c690163dd2. {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 737.210045] env[61439]: DEBUG oslo_concurrency.lockutils [req-ec7e1f20-dc57-4335-8b8d-ec85db8b8a77 req-b4eecc69-40cb-49ae-9256-b8c2d07709bd service nova] Acquiring lock "refresh_cache-fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 737.456667] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Preparing fetch location {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 737.459180] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Creating directory with path [datastore2] vmware_temp/99326781-d3c8-4ba6-af81-b104ad181019/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 737.459180] env[61439]: DEBUG oslo_vmware.service [-] Invoking 
FileManager.MakeDirectory with opID=oslo.vmware-e9954740-4a05-46ac-83bd-c82487c69538 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 737.479327] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Created directory with path [datastore2] vmware_temp/99326781-d3c8-4ba6-af81-b104ad181019/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 737.479327] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Fetch image to [datastore2] vmware_temp/99326781-d3c8-4ba6-af81-b104ad181019/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 737.479527] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to [datastore2] vmware_temp/99326781-d3c8-4ba6-af81-b104ad181019/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 737.480528] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bb82993-a4d4-4177-993a-c09a01e97ab6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 737.489108] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-733183c1-5bff-4d6a-a0e5-af3e381c4172 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 737.500289] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f79483c-0389-4fce-b9f5-ecc6901643f1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 737.538137] env[61439]: DEBUG nova.network.neutron [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 737.541408] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6077d7b-18f9-4380-aa1e-9dfd5f57b6f9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 737.551522] env[61439]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4a7eddeb-77dc-4636-af31-2267021c9371 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 737.554860] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Releasing lock "refresh_cache-a41cb33f-8340-4b15-b19d-0a7b9396eae7" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 737.554971] env[61439]: DEBUG nova.compute.manager [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Start destroying 
the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 737.555178] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 737.556392] env[61439]: DEBUG nova.network.neutron [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 737.558774] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2525529f-0f42-44d9-8f46-1e9a4cbb9962 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 737.566052] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Unregistering the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 737.566193] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b96f064a-df40-4e95-ae2c-8ca35e8822f6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 737.570453] env[61439]: DEBUG oslo_concurrency.lockutils [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Releasing lock "refresh_cache-fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4" 
{{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 737.570453] env[61439]: DEBUG nova.compute.manager [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 737.570453] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 737.570453] env[61439]: DEBUG oslo_concurrency.lockutils [req-ec7e1f20-dc57-4335-8b8d-ec85db8b8a77 req-b4eecc69-40cb-49ae-9256-b8c2d07709bd service nova] Acquired lock "refresh_cache-fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 737.570453] env[61439]: DEBUG nova.network.neutron [req-ec7e1f20-dc57-4335-8b8d-ec85db8b8a77 req-b4eecc69-40cb-49ae-9256-b8c2d07709bd service nova] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Refreshing network info cache for port 6cbd0454-8188-4cc6-859b-95c690163dd2 {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 737.571970] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-516fa9a4-cdf4-40c3-b618-e99dded10a2c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 737.578473] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 
8fe2bccd-5b46-4067-b72b-bdbf726c0155] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 737.598559] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71779ce0-2a49-4586-83dc-4ba11ba40e27 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 737.610077] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Unregistered the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 737.610175] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Deleting contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 737.610287] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Deleting the datastore file [datastore2] a41cb33f-8340-4b15-b19d-0a7b9396eae7 {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 737.612588] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7c9019d8-fd11-43ff-94d8-246ed450ab3e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 737.623171] env[61439]: DEBUG oslo_vmware.api [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 
tempest-ServerShowV257Test-1759270808-project-member] Waiting for the task: (returnval){ [ 737.623171] env[61439]: value = "task-987669" [ 737.623171] env[61439]: _type = "Task" [ 737.623171] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 737.634179] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4 could not be found. [ 737.634407] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 737.634644] env[61439]: INFO nova.compute.manager [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Took 0.06 seconds to destroy the instance on the hypervisor. [ 737.635068] env[61439]: DEBUG oslo.service.loopingcall [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 737.638978] env[61439]: DEBUG nova.compute.manager [-] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 737.639685] env[61439]: DEBUG nova.network.neutron [-] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 737.646939] env[61439]: DEBUG oslo_vmware.api [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Task: {'id': task-987669, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 737.669931] env[61439]: DEBUG oslo_vmware.rw_handles [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/99326781-d3c8-4ba6-af81-b104ad181019/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 737.736905] env[61439]: DEBUG nova.network.neutron [req-ec7e1f20-dc57-4335-8b8d-ec85db8b8a77 req-b4eecc69-40cb-49ae-9256-b8c2d07709bd service nova] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 737.741325] env[61439]: DEBUG oslo_vmware.rw_handles [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Completed reading data from the image iterator. {{(pid=61439) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 737.742126] env[61439]: DEBUG oslo_vmware.rw_handles [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/99326781-d3c8-4ba6-af81-b104ad181019/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 737.783577] env[61439]: DEBUG nova.network.neutron [-] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 737.797410] env[61439]: DEBUG nova.network.neutron [-] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 737.818095] env[61439]: INFO nova.compute.manager [-] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Took 0.18 seconds to deallocate network for instance. 
[ 737.819298] env[61439]: DEBUG nova.compute.claims [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 737.819516] env[61439]: DEBUG oslo_concurrency.lockutils [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 737.819896] env[61439]: DEBUG oslo_concurrency.lockutils [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 738.085050] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2e523cd-efb9-4969-81d8-af586d13d8bd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 738.098944] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94ff8fa5-4169-48f7-8291-e9342a91dd25 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 738.142702] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78d74032-547b-4130-a84c-21dd838737e8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 738.149269] 
env[61439]: DEBUG oslo_vmware.api [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Task: {'id': task-987669, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.043733} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 738.155239] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Deleted the datastore file {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 738.155239] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Deleted contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 738.155239] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 738.155239] env[61439]: INFO nova.compute.manager [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 738.155239] env[61439]: DEBUG oslo.service.loopingcall [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 738.155383] env[61439]: DEBUG nova.compute.manager [-] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Skipping network deallocation for instance since networking was not requested. {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 738.155383] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bd0c50c-db98-467b-9bd3-d07614803ddf {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 738.164707] env[61439]: DEBUG nova.compute.claims [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 738.164881] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 738.175471] env[61439]: DEBUG nova.compute.provider_tree [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41
{{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 738.188336] env[61439]: DEBUG nova.scheduler.client.report [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 738.215119] env[61439]: DEBUG oslo_concurrency.lockutils [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.395s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 738.215769] env[61439]: ERROR nova.compute.manager [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 6cbd0454-8188-4cc6-859b-95c690163dd2, please check neutron logs for more information. 
[ 738.215769] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Traceback (most recent call last): [ 738.215769] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 738.215769] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] self.driver.spawn(context, instance, image_meta, [ 738.215769] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 738.215769] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 738.215769] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 738.215769] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] vm_ref = self.build_virtual_machine(instance, [ 738.215769] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 738.215769] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] vif_infos = vmwarevif.get_vif_info(self._session, [ 738.215769] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 738.216093] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] for vif in network_info: [ 738.216093] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 738.216093] env[61439]: ERROR 
nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] return self._sync_wrapper(fn, *args, **kwargs) [ 738.216093] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 738.216093] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] self.wait() [ 738.216093] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 738.216093] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] self[:] = self._gt.wait() [ 738.216093] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 738.216093] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] return self._exit_event.wait() [ 738.216093] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 738.216093] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] result = hub.switch() [ 738.216093] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 738.216093] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] return self.greenlet.switch() [ 738.216378] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 738.216378] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] result 
= function(*args, **kwargs) [ 738.216378] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 738.216378] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] return func(*args, **kwargs) [ 738.216378] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 738.216378] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] raise e [ 738.216378] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 738.216378] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] nwinfo = self.network_api.allocate_for_instance( [ 738.216378] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 738.216378] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] created_port_ids = self._update_ports_for_instance( [ 738.216378] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 738.216378] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] with excutils.save_and_reraise_exception(): [ 738.216378] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 738.216673] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] self.force_reraise() [ 738.216673] env[61439]: 
ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 738.216673] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] raise self.value [ 738.216673] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 738.216673] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] updated_port = self._update_port( [ 738.216673] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 738.216673] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] _ensure_no_port_binding_failure(port) [ 738.216673] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 738.216673] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] raise exception.PortBindingFailed(port_id=port['id']) [ 738.216673] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] nova.exception.PortBindingFailed: Binding failed for port 6cbd0454-8188-4cc6-859b-95c690163dd2, please check neutron logs for more information. [ 738.216673] env[61439]: ERROR nova.compute.manager [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] [ 738.216932] env[61439]: DEBUG nova.compute.utils [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Binding failed for port 6cbd0454-8188-4cc6-859b-95c690163dd2, please check neutron logs for more information. 
{{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 738.217917] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.053s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 738.222757] env[61439]: DEBUG nova.compute.manager [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Build of instance fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4 was re-scheduled: Binding failed for port 6cbd0454-8188-4cc6-859b-95c690163dd2, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 738.222757] env[61439]: DEBUG nova.compute.manager [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 738.222757] env[61439]: DEBUG oslo_concurrency.lockutils [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "refresh_cache-fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 738.336175] env[61439]: DEBUG nova.network.neutron [req-ec7e1f20-dc57-4335-8b8d-ec85db8b8a77 req-b4eecc69-40cb-49ae-9256-b8c2d07709bd service nova] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Updating instance_info_cache with 
network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 738.354817] env[61439]: DEBUG oslo_concurrency.lockutils [req-ec7e1f20-dc57-4335-8b8d-ec85db8b8a77 req-b4eecc69-40cb-49ae-9256-b8c2d07709bd service nova] Releasing lock "refresh_cache-fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 738.355265] env[61439]: DEBUG oslo_concurrency.lockutils [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquired lock "refresh_cache-fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 738.355449] env[61439]: DEBUG nova.network.neutron [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 738.479034] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2894c39b-1a64-47bd-a687-0fa9605e7e84 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 738.491329] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38100397-efd8-40cc-94b8-b57131602539 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 738.494589] env[61439]: DEBUG nova.network.neutron [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Instance cache missing network 
info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 738.531464] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ca63378-d7c2-4527-a1f7-33651e46fdd5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 738.546752] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fe81390-d885-40a5-9263-60509ff5cb2e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 738.563611] env[61439]: DEBUG nova.compute.provider_tree [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 738.575416] env[61439]: DEBUG nova.scheduler.client.report [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 738.592671] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.375s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 738.593226] env[61439]: ERROR nova.compute.manager [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 738.593226] env[61439]: Faults: ['InvalidArgument'] [ 738.593226] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Traceback (most recent call last): [ 738.593226] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 738.593226] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] self.driver.spawn(context, instance, image_meta, [ 738.593226] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 738.593226] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 738.593226] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 738.593226] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] self._fetch_image_if_missing(context, vi) [ 738.593226] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 738.593226] env[61439]: ERROR nova.compute.manager 
[instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] image_cache(vi, tmp_image_ds_loc)
[ 738.593226] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 738.593610] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] vm_util.copy_virtual_disk(
[ 738.593610] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 738.593610] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] session._wait_for_task(vmdk_copy_task)
[ 738.593610] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 738.593610] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] return self.wait_for_task(task_ref)
[ 738.593610] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 738.593610] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] return evt.wait()
[ 738.593610] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 738.593610] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] result = hub.switch()
[ 738.593610] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 738.593610] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] return self.greenlet.switch()
[ 738.593610] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 738.593610] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] self.f(*self.args, **self.kw)
[ 738.593947] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 738.593947] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] raise exceptions.translate_fault(task_info.error)
[ 738.593947] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 738.593947] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Faults: ['InvalidArgument']
[ 738.593947] env[61439]: ERROR nova.compute.manager [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7]
[ 738.594076] env[61439]: DEBUG nova.compute.utils [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] VimFaultException {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 738.595505] env[61439]: DEBUG nova.compute.manager [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Build of instance a41cb33f-8340-4b15-b19d-0a7b9396eae7 was re-scheduled: A specified parameter was not correct: fileType
[ 738.595505] env[61439]: Faults: ['InvalidArgument'] {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 738.595878] env[61439]: DEBUG nova.compute.manager [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 738.596117] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Acquiring lock "refresh_cache-a41cb33f-8340-4b15-b19d-0a7b9396eae7" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 738.600624] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Acquired lock "refresh_cache-a41cb33f-8340-4b15-b19d-0a7b9396eae7" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 738.600624] env[61439]: DEBUG nova.network.neutron [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 738.815773] env[61439]: DEBUG nova.network.neutron [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 739.347292] env[61439]: DEBUG nova.network.neutron [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 739.360710] env[61439]: DEBUG oslo_concurrency.lockutils [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Releasing lock "refresh_cache-fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 739.360946] env[61439]: DEBUG nova.compute.manager [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 739.361682] env[61439]: DEBUG nova.compute.manager [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 739.362162] env[61439]: DEBUG nova.network.neutron [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 739.439749] env[61439]: DEBUG nova.network.neutron [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 739.453213] env[61439]: DEBUG nova.network.neutron [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 739.466234] env[61439]: INFO nova.compute.manager [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4] Took 0.10 seconds to deallocate network for instance.
[ 739.522622] env[61439]: DEBUG nova.network.neutron [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 739.542081] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Releasing lock "refresh_cache-a41cb33f-8340-4b15-b19d-0a7b9396eae7" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 739.542978] env[61439]: DEBUG nova.compute.manager [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 739.542978] env[61439]: DEBUG nova.compute.manager [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] [instance: a41cb33f-8340-4b15-b19d-0a7b9396eae7] Skipping network deallocation for instance since networking was not requested. {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 739.653590] env[61439]: INFO nova.scheduler.client.report [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Deleted allocations for instance fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4
[ 739.678703] env[61439]: DEBUG oslo_concurrency.lockutils [None req-929cd172-2443-42b2-9540-44bbb5d2c2bb tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "fda9c4e8-f38d-4b40-8678-5dc49bbf1ca4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.374s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 739.694028] env[61439]: INFO nova.scheduler.client.report [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Deleted allocations for instance a41cb33f-8340-4b15-b19d-0a7b9396eae7
[ 739.724061] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f29cd538-c97a-42cc-81b2-bc5542e57d4e tempest-ServerShowV257Test-1759270808 tempest-ServerShowV257Test-1759270808-project-member] Lock "a41cb33f-8340-4b15-b19d-0a7b9396eae7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 54.488s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 739.912556] env[61439]: DEBUG nova.compute.manager [req-cff81357-a896-40f8-8c3f-451007dd44a6 req-ad37e3a3-3217-4ec4-bbb7-638010d427cf service nova] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Received event network-changed-e6d1e53a-74f0-4da3-8ede-885bbd0c7111 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 739.914012] env[61439]: DEBUG nova.compute.manager [req-cff81357-a896-40f8-8c3f-451007dd44a6 req-ad37e3a3-3217-4ec4-bbb7-638010d427cf service nova] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Refreshing instance network info cache due to event network-changed-e6d1e53a-74f0-4da3-8ede-885bbd0c7111. {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 739.914357] env[61439]: DEBUG oslo_concurrency.lockutils [req-cff81357-a896-40f8-8c3f-451007dd44a6 req-ad37e3a3-3217-4ec4-bbb7-638010d427cf service nova] Acquiring lock "refresh_cache-d79dac42-38fa-401b-9864-5fbdf80b89ec" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 739.914522] env[61439]: DEBUG oslo_concurrency.lockutils [req-cff81357-a896-40f8-8c3f-451007dd44a6 req-ad37e3a3-3217-4ec4-bbb7-638010d427cf service nova] Acquired lock "refresh_cache-d79dac42-38fa-401b-9864-5fbdf80b89ec" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 739.914721] env[61439]: DEBUG nova.network.neutron [req-cff81357-a896-40f8-8c3f-451007dd44a6 req-ad37e3a3-3217-4ec4-bbb7-638010d427cf service nova] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Refreshing network info cache for port e6d1e53a-74f0-4da3-8ede-885bbd0c7111 {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 740.366780] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Acquiring lock "5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 740.367392] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Lock "5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 740.379546] env[61439]: DEBUG nova.compute.manager [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 740.474151] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 740.474151] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 740.475659] env[61439]: INFO nova.compute.claims [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 740.534693] env[61439]: DEBUG nova.network.neutron [req-cff81357-a896-40f8-8c3f-451007dd44a6 req-ad37e3a3-3217-4ec4-bbb7-638010d427cf service nova] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 740.575052] env[61439]: ERROR nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port f7421b70-6d30-4120-be13-e6378ad0ffdb, please check neutron logs for more information.
[ 740.575052] env[61439]: ERROR nova.compute.manager Traceback (most recent call last):
[ 740.575052] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 740.575052] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 740.575052] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 740.575052] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 740.575052] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 740.575052] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 740.575052] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 740.575052] env[61439]: ERROR nova.compute.manager self.force_reraise()
[ 740.575052] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 740.575052] env[61439]: ERROR nova.compute.manager raise self.value
[ 740.575052] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 740.575052] env[61439]: ERROR nova.compute.manager updated_port = self._update_port(
[ 740.575052] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 740.575052] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 740.575547] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 740.575547] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 740.575547] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port f7421b70-6d30-4120-be13-e6378ad0ffdb, please check neutron logs for more information.
[ 740.575547] env[61439]: ERROR nova.compute.manager
[ 740.575547] env[61439]: Traceback (most recent call last):
[ 740.575547] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait
[ 740.575547] env[61439]: listener.cb(fileno)
[ 740.575547] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 740.575547] env[61439]: result = function(*args, **kwargs)
[ 740.575547] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 740.575547] env[61439]: return func(*args, **kwargs)
[ 740.575547] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 740.575547] env[61439]: raise e
[ 740.575547] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 740.575547] env[61439]: nwinfo = self.network_api.allocate_for_instance(
[ 740.575547] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 740.575547] env[61439]: created_port_ids = self._update_ports_for_instance(
[ 740.575547] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 740.575547] env[61439]: with excutils.save_and_reraise_exception():
[ 740.575547] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 740.575547] env[61439]: self.force_reraise()
[ 740.575547] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 740.575547] env[61439]: raise self.value
[ 740.575547] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 740.575547] env[61439]: updated_port = self._update_port(
[ 740.575547] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 740.575547] env[61439]: _ensure_no_port_binding_failure(port)
[ 740.575547] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 740.575547] env[61439]: raise exception.PortBindingFailed(port_id=port['id'])
[ 740.578435] env[61439]: nova.exception.PortBindingFailed: Binding failed for port f7421b70-6d30-4120-be13-e6378ad0ffdb, please check neutron logs for more information.
[ 740.578435] env[61439]: Removing descriptor: 10
[ 740.578435] env[61439]: ERROR nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port f7421b70-6d30-4120-be13-e6378ad0ffdb, please check neutron logs for more information.
[ 740.578435] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Traceback (most recent call last):
[ 740.578435] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 740.578435] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] yield resources
[ 740.578435] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 740.578435] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] self.driver.spawn(context, instance, image_meta,
[ 740.578435] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 740.578435] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 740.578435] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 740.578435] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] vm_ref = self.build_virtual_machine(instance,
[ 740.578889] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 740.578889] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] vif_infos = vmwarevif.get_vif_info(self._session,
[ 740.578889] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 740.578889] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] for vif in network_info:
[ 740.578889] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 740.578889] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] return self._sync_wrapper(fn, *args, **kwargs)
[ 740.578889] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 740.578889] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] self.wait()
[ 740.578889] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 740.578889] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] self[:] = self._gt.wait()
[ 740.578889] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 740.578889] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] return self._exit_event.wait()
[ 740.578889] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 740.579310] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] result = hub.switch()
[ 740.579310] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 740.579310] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] return self.greenlet.switch()
[ 740.579310] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 740.579310] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] result = function(*args, **kwargs)
[ 740.579310] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 740.579310] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] return func(*args, **kwargs)
[ 740.579310] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 740.579310] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] raise e
[ 740.579310] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 740.579310] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] nwinfo = self.network_api.allocate_for_instance(
[ 740.579310] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 740.579310] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] created_port_ids = self._update_ports_for_instance(
[ 740.579688] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 740.579688] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] with excutils.save_and_reraise_exception():
[ 740.579688] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 740.579688] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] self.force_reraise()
[ 740.579688] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 740.579688] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] raise self.value
[ 740.579688] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 740.579688] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] updated_port = self._update_port(
[ 740.579688] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 740.579688] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] _ensure_no_port_binding_failure(port)
[ 740.579688] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 740.579688] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] raise exception.PortBindingFailed(port_id=port['id'])
[ 740.580032] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] nova.exception.PortBindingFailed: Binding failed for port f7421b70-6d30-4120-be13-e6378ad0ffdb, please check neutron logs for more information.
[ 740.580032] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0]
[ 740.580032] env[61439]: INFO nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Terminating instance
[ 740.581163] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Acquiring lock "refresh_cache-2f7123c7-f863-4bb6-a899-5feb618c6ce0" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 740.581163] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Acquired lock "refresh_cache-2f7123c7-f863-4bb6-a899-5feb618c6ce0" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 740.581163] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 740.696049] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 740.699643] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-decc37e3-c0d2-43b4-ac63-4f7b1e8cd183 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 740.703495] env[61439]: ERROR nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 17e9cb25-c36e-4d26-b65a-e0135ece2194, please check neutron logs for more information.
[ 740.703495] env[61439]: ERROR nova.compute.manager Traceback (most recent call last):
[ 740.703495] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 740.703495] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 740.703495] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 740.703495] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 740.703495] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 740.703495] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 740.703495] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 740.703495] env[61439]: ERROR nova.compute.manager self.force_reraise()
[ 740.703495] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 740.703495] env[61439]: ERROR nova.compute.manager raise self.value
[ 740.703495] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 740.703495] env[61439]: ERROR nova.compute.manager updated_port = self._update_port(
[ 740.703495] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 740.703495] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 740.704073] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 740.704073] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 740.704073] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 17e9cb25-c36e-4d26-b65a-e0135ece2194, please check neutron logs for more information.
[ 740.704073] env[61439]: ERROR nova.compute.manager
[ 740.704623] env[61439]: Traceback (most recent call last):
[ 740.704623] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait
[ 740.704623] env[61439]: listener.cb(fileno)
[ 740.704623] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 740.704623] env[61439]: result = function(*args, **kwargs)
[ 740.704623] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 740.704623] env[61439]: return func(*args, **kwargs)
[ 740.704623] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 740.704623] env[61439]: raise e
[ 740.704623] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 740.704623] env[61439]: nwinfo = self.network_api.allocate_for_instance(
[ 740.704623] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 740.704623] env[61439]: created_port_ids = self._update_ports_for_instance(
[ 740.704623] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 740.704623] env[61439]: with excutils.save_and_reraise_exception():
[ 740.704623] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 740.704623] env[61439]: self.force_reraise()
[ 740.704623] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 740.704623] env[61439]: raise self.value
[ 740.704623] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 740.704623] env[61439]: updated_port = self._update_port(
[ 740.704623] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 740.704623] env[61439]: _ensure_no_port_binding_failure(port)
[ 740.704623] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 740.704623] env[61439]: raise exception.PortBindingFailed(port_id=port['id'])
[ 740.704623] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 17e9cb25-c36e-4d26-b65a-e0135ece2194, please check neutron logs for more information.
[ 740.704623] env[61439]: Removing descriptor: 18
[ 740.705508] env[61439]: ERROR nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 17e9cb25-c36e-4d26-b65a-e0135ece2194, please check neutron logs for more information.
[ 740.705508] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Traceback (most recent call last):
[ 740.705508] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 740.705508] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] yield resources
[ 740.705508] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 740.705508] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] self.driver.spawn(context, instance, image_meta,
[ 740.705508] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 740.705508] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 740.705508] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 740.705508] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] vm_ref = self.build_virtual_machine(instance,
[ 740.705508] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 740.706375] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] vif_infos = vmwarevif.get_vif_info(self._session,
[ 740.706375] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 740.706375] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] for vif in network_info:
[ 740.706375] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 740.706375] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] return self._sync_wrapper(fn, *args, **kwargs)
[ 740.706375] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 740.706375] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] self.wait()
[ 740.706375] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 740.706375] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] self[:] = self._gt.wait()
[ 740.706375] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 740.706375] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] return self._exit_event.wait()
[ 740.706375] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 740.706375] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] result = hub.switch()
[ 740.706917] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 740.706917] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] return self.greenlet.switch()
[ 740.706917]
env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 740.706917] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] result = function(*args, **kwargs) [ 740.706917] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 740.706917] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] return func(*args, **kwargs) [ 740.706917] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 740.706917] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] raise e [ 740.706917] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 740.706917] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] nwinfo = self.network_api.allocate_for_instance( [ 740.706917] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 740.706917] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] created_port_ids = self._update_ports_for_instance( [ 740.706917] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 740.707767] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] with excutils.save_and_reraise_exception(): [ 740.707767] env[61439]: ERROR nova.compute.manager 
[instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 740.707767] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] self.force_reraise() [ 740.707767] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 740.707767] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] raise self.value [ 740.707767] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 740.707767] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] updated_port = self._update_port( [ 740.707767] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 740.707767] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] _ensure_no_port_binding_failure(port) [ 740.707767] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 740.707767] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] raise exception.PortBindingFailed(port_id=port['id']) [ 740.707767] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] nova.exception.PortBindingFailed: Binding failed for port 17e9cb25-c36e-4d26-b65a-e0135ece2194, please check neutron logs for more information. 
[ 740.707767] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] [ 740.708614] env[61439]: INFO nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Terminating instance [ 740.708614] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Acquiring lock "refresh_cache-74e5ad9b-1b5b-492d-8642-ed271d8f70e3" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 740.708614] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Acquired lock "refresh_cache-74e5ad9b-1b5b-492d-8642-ed271d8f70e3" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 740.708614] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 740.713633] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a20ff16-e7ea-4973-a528-ae8bbcd1f267 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 740.749983] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d46f000e-32de-41c5-a72f-281878f92b28 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 740.759508] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0905c27b-13c2-4fc5-95bf-9b5a59eb1cea {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 740.776972] env[61439]: DEBUG nova.compute.provider_tree [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 740.786743] env[61439]: DEBUG nova.scheduler.client.report [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 740.806073] env[61439]: ERROR nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port e6d1e53a-74f0-4da3-8ede-885bbd0c7111, please check neutron logs for more information. 
[ 740.806073] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 740.806073] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 740.806073] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 740.806073] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 740.806073] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 740.806073] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 740.806073] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 740.806073] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 740.806073] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 740.806073] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 740.806073] env[61439]: ERROR nova.compute.manager raise self.value [ 740.806073] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 740.806073] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 740.806073] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 740.806073] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 740.806588] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 740.806588] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 740.806588] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port e6d1e53a-74f0-4da3-8ede-885bbd0c7111, please check neutron logs for more information. [ 740.806588] env[61439]: ERROR nova.compute.manager [ 740.806588] env[61439]: Traceback (most recent call last): [ 740.806588] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 740.806588] env[61439]: listener.cb(fileno) [ 740.806588] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 740.806588] env[61439]: result = function(*args, **kwargs) [ 740.806588] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 740.806588] env[61439]: return func(*args, **kwargs) [ 740.806588] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 740.806588] env[61439]: raise e [ 740.806588] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 740.806588] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 740.806588] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 740.806588] env[61439]: created_port_ids = self._update_ports_for_instance( [ 740.806588] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 740.806588] env[61439]: with excutils.save_and_reraise_exception(): [ 740.806588] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 740.806588] env[61439]: self.force_reraise() [ 740.806588] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 740.806588] env[61439]: raise self.value [ 740.806588] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 740.806588] env[61439]: 
updated_port = self._update_port( [ 740.806588] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 740.806588] env[61439]: _ensure_no_port_binding_failure(port) [ 740.806588] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 740.806588] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 740.807237] env[61439]: nova.exception.PortBindingFailed: Binding failed for port e6d1e53a-74f0-4da3-8ede-885bbd0c7111, please check neutron logs for more information. [ 740.807237] env[61439]: Removing descriptor: 23 [ 740.808894] env[61439]: ERROR nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port e6d1e53a-74f0-4da3-8ede-885bbd0c7111, please check neutron logs for more information. 
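Every traceback in this log passes through `oslo_utils.excutils.save_and_reraise_exception()` and its `force_reraise()` / `raise self.value` frames. A minimal, simplified re-implementation of that context-manager pattern (the real oslo.utils version also supports logging and a `reraise` flag; this sketch only mirrors the frames visible above):

```python
# Simplified sketch of the save_and_reraise_exception pattern seen in
# the tracebacks; not the real oslo_utils implementation.
class save_and_reraise_exception:
    """Context manager that lets cleanup code run in __exit__ and then
    re-raises the original exception so callers still see it."""

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        self.value = exc_val
        if exc_val is not None:
            # Cleanup (e.g. deallocating ports) would run here before
            # the original error is propagated.
            self.force_reraise()
        return False

    def force_reraise(self):
        raise self.value
```

The point of the pattern is that `_update_ports_for_instance` can clean up partially created ports on failure without swallowing the original `PortBindingFailed`, which is why the same exception surfaces again at the top of each traceback.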
[ 740.808894] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Traceback (most recent call last): [ 740.808894] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 740.808894] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] yield resources [ 740.808894] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 740.808894] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] self.driver.spawn(context, instance, image_meta, [ 740.808894] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 740.808894] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] self._vmops.spawn(context, instance, image_meta, injected_files, [ 740.808894] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 740.808894] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] vm_ref = self.build_virtual_machine(instance, [ 740.808894] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 740.811570] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] vif_infos = vmwarevif.get_vif_info(self._session, [ 740.811570] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 740.811570] env[61439]: ERROR 
nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] for vif in network_info: [ 740.811570] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 740.811570] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] return self._sync_wrapper(fn, *args, **kwargs) [ 740.811570] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 740.811570] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] self.wait() [ 740.811570] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 740.811570] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] self[:] = self._gt.wait() [ 740.811570] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 740.811570] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] return self._exit_event.wait() [ 740.811570] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 740.811570] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] result = hub.switch() [ 740.812100] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 740.812100] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] return self.greenlet.switch() [ 740.812100] 
env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 740.812100] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] result = function(*args, **kwargs) [ 740.812100] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 740.812100] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] return func(*args, **kwargs) [ 740.812100] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 740.812100] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] raise e [ 740.812100] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 740.812100] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] nwinfo = self.network_api.allocate_for_instance( [ 740.812100] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 740.812100] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] created_port_ids = self._update_ports_for_instance( [ 740.812100] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 740.812504] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] with excutils.save_and_reraise_exception(): [ 740.812504] env[61439]: ERROR nova.compute.manager 
[instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 740.812504] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] self.force_reraise() [ 740.812504] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 740.812504] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] raise self.value [ 740.812504] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 740.812504] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] updated_port = self._update_port( [ 740.812504] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 740.812504] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] _ensure_no_port_binding_failure(port) [ 740.812504] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 740.812504] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] raise exception.PortBindingFailed(port_id=port['id']) [ 740.812504] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] nova.exception.PortBindingFailed: Binding failed for port e6d1e53a-74f0-4da3-8ede-885bbd0c7111, please check neutron logs for more information. 
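The `self[:] = self._gt.wait()` frames above come from Nova's pattern of allocating network info in a background greenthread and blocking on first use (iterating `network_info` in `get_vif_info` triggers the wait, which is where the deferred `PortBindingFailed` finally surfaces). A hypothetical sketch of the same idea using a plain thread instead of eventlet; the class and method names are illustrative, not Nova's:

```python
# Hypothetical sketch of deferred network allocation with a blocking
# wait on first use; inspired by the frames above, not Nova's code.
import threading


class AsyncNetworkInfo(list):
    """List-like wrapper whose contents are filled in by a background
    allocation; iterating it blocks until allocation finishes and
    re-raises any allocation error (as seen in the traceback)."""

    def __init__(self, allocate_fn):
        super().__init__()
        self._result = None
        self._error = None

        def run():
            try:
                self._result = allocate_fn()
            except Exception as e:
                self._error = e

        self._thread = threading.Thread(target=run)
        self._thread.start()

    def wait(self):
        self._thread.join()
        if self._error is not None:
            raise self._error  # deferred failure surfaces here
        self[:] = self._result
        return self

    def __iter__(self):
        self.wait()
        return super().__iter__()
```

This explains why a failure in `_allocate_network_async` appears inside the VMware driver's `build_virtual_machine`: the error is stored when allocation fails and only raised later, when `for vif in network_info:` forces the wait.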
[ 740.812504] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] [ 740.813876] env[61439]: INFO nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Terminating instance [ 740.813876] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.337s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 740.813876] env[61439]: DEBUG nova.compute.manager [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 740.815195] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Acquiring lock "refresh_cache-d79dac42-38fa-401b-9864-5fbdf80b89ec" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 740.842434] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 740.868750] env[61439]: DEBUG nova.compute.utils [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 740.871536] env[61439]: DEBUG nova.compute.manager [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 740.872159] env[61439]: DEBUG nova.network.neutron [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 740.887028] env[61439]: DEBUG nova.compute.manager [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 740.973585] env[61439]: DEBUG nova.compute.manager [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 741.007044] env[61439]: DEBUG nova.virt.hardware [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 741.007472] env[61439]: DEBUG nova.virt.hardware [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 741.007721] env[61439]: DEBUG nova.virt.hardware [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 741.008014] env[61439]: DEBUG nova.virt.hardware [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Flavor pref 0:0:0 
{{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 741.008347] env[61439]: DEBUG nova.virt.hardware [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 741.008678] env[61439]: DEBUG nova.virt.hardware [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 741.009091] env[61439]: DEBUG nova.virt.hardware [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 741.009346] env[61439]: DEBUG nova.virt.hardware [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 741.009608] env[61439]: DEBUG nova.virt.hardware [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 741.009984] env[61439]: DEBUG nova.virt.hardware [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] 
Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 741.010349] env[61439]: DEBUG nova.virt.hardware [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 741.011692] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ee54d42-9e74-4c04-9d51-fda27c9c9527 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 741.021272] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d745a39-7eb4-44f3-8f1a-ee424a7c1975 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 741.403637] env[61439]: DEBUG nova.compute.manager [req-939a81f3-fd90-4ed0-ad34-3324e14cad05 req-9adbeb70-5006-43d2-998e-8ee28883c347 service nova] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Received event network-changed-f7421b70-6d30-4120-be13-e6378ad0ffdb {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 741.403962] env[61439]: DEBUG nova.compute.manager [req-939a81f3-fd90-4ed0-ad34-3324e14cad05 req-9adbeb70-5006-43d2-998e-8ee28883c347 service nova] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Refreshing instance network info cache due to event network-changed-f7421b70-6d30-4120-be13-e6378ad0ffdb. 
{{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 741.404172] env[61439]: DEBUG oslo_concurrency.lockutils [req-939a81f3-fd90-4ed0-ad34-3324e14cad05 req-9adbeb70-5006-43d2-998e-8ee28883c347 service nova] Acquiring lock "refresh_cache-2f7123c7-f863-4bb6-a899-5feb618c6ce0" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 741.476266] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 741.489714] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Releasing lock "refresh_cache-2f7123c7-f863-4bb6-a899-5feb618c6ce0" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 741.490417] env[61439]: DEBUG nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 741.490812] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 741.491312] env[61439]: DEBUG oslo_concurrency.lockutils [req-939a81f3-fd90-4ed0-ad34-3324e14cad05 req-9adbeb70-5006-43d2-998e-8ee28883c347 service nova] Acquired lock "refresh_cache-2f7123c7-f863-4bb6-a899-5feb618c6ce0" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 741.491747] env[61439]: DEBUG nova.network.neutron [req-939a81f3-fd90-4ed0-ad34-3324e14cad05 req-9adbeb70-5006-43d2-998e-8ee28883c347 service nova] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Refreshing network info cache for port f7421b70-6d30-4120-be13-e6378ad0ffdb {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 741.493456] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-06de7692-2c79-40f0-8b5f-c36b433b3bcd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 741.509719] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3c056ce-1315-4800-bec5-c9803503a060 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 741.539597] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 
2f7123c7-f863-4bb6-a899-5feb618c6ce0 could not be found. [ 741.539918] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 741.540302] env[61439]: INFO nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Took 0.05 seconds to destroy the instance on the hypervisor. [ 741.540790] env[61439]: DEBUG oslo.service.loopingcall [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 741.541056] env[61439]: DEBUG nova.compute.manager [-] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 741.541163] env[61439]: DEBUG nova.network.neutron [-] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 741.603558] env[61439]: DEBUG nova.policy [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2d0376ef3212459b883f3b757a17316f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c85d469fb8e045f7b6981676c526d780', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 741.609327] env[61439]: DEBUG nova.network.neutron [req-939a81f3-fd90-4ed0-ad34-3324e14cad05 req-9adbeb70-5006-43d2-998e-8ee28883c347 service nova] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 741.651902] env[61439]: DEBUG nova.network.neutron [-] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 741.655601] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 741.669016] env[61439]: DEBUG nova.network.neutron [-] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 741.671764] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Releasing lock "refresh_cache-74e5ad9b-1b5b-492d-8642-ed271d8f70e3" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 741.672604] env[61439]: DEBUG nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 741.672985] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 741.674163] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0b2c4ee3-59ae-41a7-b071-48c49e719f06 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 741.686713] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f9a4ff7-6187-48e6-bcb8-cf80f887fa72 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 741.697939] env[61439]: INFO nova.compute.manager [-] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Took 0.16 seconds to deallocate network for instance. 
[ 741.700382] env[61439]: DEBUG nova.compute.claims [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 741.700590] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 741.700829] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 741.715027] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 74e5ad9b-1b5b-492d-8642-ed271d8f70e3 could not be found. 
[ 741.715027] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 741.715027] env[61439]: INFO nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Took 0.04 seconds to destroy the instance on the hypervisor. [ 741.715027] env[61439]: DEBUG oslo.service.loopingcall [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 741.715027] env[61439]: DEBUG nova.compute.manager [-] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 741.715275] env[61439]: DEBUG nova.network.neutron [-] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 741.730904] env[61439]: DEBUG nova.network.neutron [req-cff81357-a896-40f8-8c3f-451007dd44a6 req-ad37e3a3-3217-4ec4-bbb7-638010d427cf service nova] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 741.745602] env[61439]: DEBUG oslo_concurrency.lockutils [req-cff81357-a896-40f8-8c3f-451007dd44a6 req-ad37e3a3-3217-4ec4-bbb7-638010d427cf service nova] Releasing lock "refresh_cache-d79dac42-38fa-401b-9864-5fbdf80b89ec" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 741.746229] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Acquired lock "refresh_cache-d79dac42-38fa-401b-9864-5fbdf80b89ec" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 741.747071] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 741.816376] 
env[61439]: DEBUG nova.network.neutron [-] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 741.831626] env[61439]: DEBUG nova.network.neutron [-] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 741.839043] env[61439]: INFO nova.compute.manager [-] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Took 0.12 seconds to deallocate network for instance. [ 741.843230] env[61439]: DEBUG nova.compute.claims [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 741.843401] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 741.862034] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 741.921177] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ffdcbc9-f584-423e-87bb-b38811e7ca3d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 741.930889] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb80c280-8f54-4f75-9533-3e27535541fc {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 741.965781] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-533ed245-bd53-45fe-a206-781871f85bf7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 741.975068] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fc3c9b2-3401-4b56-85ec-2d546a8edd12 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 741.990021] env[61439]: DEBUG nova.compute.provider_tree [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 742.003149] env[61439]: DEBUG nova.scheduler.client.report [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 
'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 742.022698] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.321s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 742.023240] env[61439]: ERROR nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port f7421b70-6d30-4120-be13-e6378ad0ffdb, please check neutron logs for more information. 
[ 742.023240] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Traceback (most recent call last): [ 742.023240] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 742.023240] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] self.driver.spawn(context, instance, image_meta, [ 742.023240] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 742.023240] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 742.023240] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 742.023240] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] vm_ref = self.build_virtual_machine(instance, [ 742.023240] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 742.023240] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] vif_infos = vmwarevif.get_vif_info(self._session, [ 742.023240] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 742.023519] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] for vif in network_info: [ 742.023519] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 742.023519] env[61439]: ERROR 
nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] return self._sync_wrapper(fn, *args, **kwargs) [ 742.023519] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 742.023519] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] self.wait() [ 742.023519] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 742.023519] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] self[:] = self._gt.wait() [ 742.023519] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 742.023519] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] return self._exit_event.wait() [ 742.023519] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 742.023519] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] result = hub.switch() [ 742.023519] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 742.023519] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] return self.greenlet.switch() [ 742.023829] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 742.023829] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] result 
= function(*args, **kwargs) [ 742.023829] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 742.023829] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] return func(*args, **kwargs) [ 742.023829] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 742.023829] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] raise e [ 742.023829] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 742.023829] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] nwinfo = self.network_api.allocate_for_instance( [ 742.023829] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 742.023829] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] created_port_ids = self._update_ports_for_instance( [ 742.023829] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 742.023829] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] with excutils.save_and_reraise_exception(): [ 742.023829] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 742.024172] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] self.force_reraise() [ 742.024172] env[61439]: 
ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 742.024172] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] raise self.value [ 742.024172] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 742.024172] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] updated_port = self._update_port( [ 742.024172] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 742.024172] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] _ensure_no_port_binding_failure(port) [ 742.024172] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 742.024172] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] raise exception.PortBindingFailed(port_id=port['id']) [ 742.024172] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] nova.exception.PortBindingFailed: Binding failed for port f7421b70-6d30-4120-be13-e6378ad0ffdb, please check neutron logs for more information. 
[ 742.024172] env[61439]: ERROR nova.compute.manager [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] [ 742.024469] env[61439]: DEBUG nova.compute.utils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Binding failed for port f7421b70-6d30-4120-be13-e6378ad0ffdb, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 742.025718] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.182s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 742.033137] env[61439]: DEBUG nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Build of instance 2f7123c7-f863-4bb6-a899-5feb618c6ce0 was re-scheduled: Binding failed for port f7421b70-6d30-4120-be13-e6378ad0ffdb, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 742.033137] env[61439]: DEBUG nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 742.033137] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Acquiring lock "refresh_cache-2f7123c7-f863-4bb6-a899-5feb618c6ce0" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 742.122349] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquiring lock "5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 742.122349] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 742.133541] env[61439]: DEBUG nova.compute.manager [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 
tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 742.138024] env[61439]: DEBUG nova.network.neutron [req-939a81f3-fd90-4ed0-ad34-3324e14cad05 req-9adbeb70-5006-43d2-998e-8ee28883c347 service nova] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 742.151328] env[61439]: DEBUG oslo_concurrency.lockutils [req-939a81f3-fd90-4ed0-ad34-3324e14cad05 req-9adbeb70-5006-43d2-998e-8ee28883c347 service nova] Releasing lock "refresh_cache-2f7123c7-f863-4bb6-a899-5feb618c6ce0" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 742.152320] env[61439]: DEBUG nova.compute.manager [req-939a81f3-fd90-4ed0-ad34-3324e14cad05 req-9adbeb70-5006-43d2-998e-8ee28883c347 service nova] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Received event network-changed-17e9cb25-c36e-4d26-b65a-e0135ece2194 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 742.152592] env[61439]: DEBUG nova.compute.manager [req-939a81f3-fd90-4ed0-ad34-3324e14cad05 req-9adbeb70-5006-43d2-998e-8ee28883c347 service nova] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Refreshing instance network info cache due to event network-changed-17e9cb25-c36e-4d26-b65a-e0135ece2194. 
{{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 742.152859] env[61439]: DEBUG oslo_concurrency.lockutils [req-939a81f3-fd90-4ed0-ad34-3324e14cad05 req-9adbeb70-5006-43d2-998e-8ee28883c347 service nova] Acquiring lock "refresh_cache-74e5ad9b-1b5b-492d-8642-ed271d8f70e3" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 742.153081] env[61439]: DEBUG oslo_concurrency.lockutils [req-939a81f3-fd90-4ed0-ad34-3324e14cad05 req-9adbeb70-5006-43d2-998e-8ee28883c347 service nova] Acquired lock "refresh_cache-74e5ad9b-1b5b-492d-8642-ed271d8f70e3" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 742.153322] env[61439]: DEBUG nova.network.neutron [req-939a81f3-fd90-4ed0-ad34-3324e14cad05 req-9adbeb70-5006-43d2-998e-8ee28883c347 service nova] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Refreshing network info cache for port 17e9cb25-c36e-4d26-b65a-e0135ece2194 {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 742.155079] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Acquired lock "refresh_cache-2f7123c7-f863-4bb6-a899-5feb618c6ce0" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 742.155079] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 742.226979] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82624f7b-68fd-458c-b5b5-b21356765a18 
tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 742.241527] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d8c09cc-58ee-492b-b1c5-530dcd233fc9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 742.245492] env[61439]: DEBUG nova.network.neutron [req-939a81f3-fd90-4ed0-ad34-3324e14cad05 req-9adbeb70-5006-43d2-998e-8ee28883c347 service nova] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 742.253209] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb63518c-ec19-4ed4-879a-9ec0a56343de {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 742.258717] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 742.297966] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e136bef7-b9c5-46e4-bf69-89666f05647b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 742.306218] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-934f5f1a-1521-472e-b83e-bafe3a18e1bf {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 742.321429] env[61439]: DEBUG nova.compute.provider_tree [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 742.339140] env[61439]: DEBUG nova.scheduler.client.report [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 742.368032] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] 
Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.341s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 742.368032] env[61439]: ERROR nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 17e9cb25-c36e-4d26-b65a-e0135ece2194, please check neutron logs for more information. [ 742.368032] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Traceback (most recent call last): [ 742.368032] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 742.368032] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] self.driver.spawn(context, instance, image_meta, [ 742.368032] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 742.368032] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 742.368032] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 742.368032] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] vm_ref = self.build_virtual_machine(instance, [ 742.368376] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 
742.368376] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] vif_infos = vmwarevif.get_vif_info(self._session, [ 742.368376] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 742.368376] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] for vif in network_info: [ 742.368376] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 742.368376] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] return self._sync_wrapper(fn, *args, **kwargs) [ 742.368376] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 742.368376] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] self.wait() [ 742.368376] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 742.368376] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] self[:] = self._gt.wait() [ 742.368376] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 742.368376] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] return self._exit_event.wait() [ 742.368376] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 742.368683] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] result = 
hub.switch() [ 742.368683] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 742.368683] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] return self.greenlet.switch() [ 742.368683] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 742.368683] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] result = function(*args, **kwargs) [ 742.368683] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 742.368683] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] return func(*args, **kwargs) [ 742.368683] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 742.368683] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] raise e [ 742.368683] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 742.368683] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] nwinfo = self.network_api.allocate_for_instance( [ 742.368683] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 742.368683] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] created_port_ids = self._update_ports_for_instance( [ 742.368990] env[61439]: ERROR 
nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 742.368990] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] with excutils.save_and_reraise_exception(): [ 742.368990] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 742.368990] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] self.force_reraise() [ 742.368990] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 742.368990] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] raise self.value [ 742.368990] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 742.368990] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] updated_port = self._update_port( [ 742.368990] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 742.368990] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] _ensure_no_port_binding_failure(port) [ 742.368990] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 742.368990] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] raise exception.PortBindingFailed(port_id=port['id']) [ 742.369292] env[61439]: ERROR 
nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] nova.exception.PortBindingFailed: Binding failed for port 17e9cb25-c36e-4d26-b65a-e0135ece2194, please check neutron logs for more information. [ 742.369292] env[61439]: ERROR nova.compute.manager [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] [ 742.369292] env[61439]: DEBUG nova.compute.utils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Binding failed for port 17e9cb25-c36e-4d26-b65a-e0135ece2194, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 742.370935] env[61439]: DEBUG nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Build of instance 74e5ad9b-1b5b-492d-8642-ed271d8f70e3 was re-scheduled: Binding failed for port 17e9cb25-c36e-4d26-b65a-e0135ece2194, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 742.371151] env[61439]: DEBUG nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 742.371490] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Acquiring lock "refresh_cache-74e5ad9b-1b5b-492d-8642-ed271d8f70e3" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 742.371813] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.146s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 742.373274] env[61439]: INFO nova.compute.claims [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 742.662291] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6dfeda84-6ef9-4d0c-97bd-59930eda1717 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 742.673520] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-7430e786-2079-48b0-b085-042fe595bec8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 742.707229] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-daf79a31-7144-46f2-9537-5bbc9a21ac47 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 742.715614] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e2c6bcd-261f-4ade-adfd-fa56bac6192c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 742.732640] env[61439]: DEBUG nova.compute.provider_tree [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 742.746696] env[61439]: DEBUG nova.scheduler.client.report [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 742.766670] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 
tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.394s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 742.767371] env[61439]: DEBUG nova.compute.manager [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 742.822101] env[61439]: DEBUG oslo_concurrency.lockutils [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Acquiring lock "42ca8a89-5938-491b-b122-deac71d18505" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 742.822213] env[61439]: DEBUG oslo_concurrency.lockutils [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Lock "42ca8a89-5938-491b-b122-deac71d18505" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 742.824248] env[61439]: DEBUG nova.compute.utils [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 742.832176] env[61439]: DEBUG nova.compute.manager [None 
req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 742.832874] env[61439]: DEBUG nova.network.neutron [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 742.836048] env[61439]: DEBUG nova.compute.manager [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 742.840083] env[61439]: DEBUG nova.compute.manager [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 742.873037] env[61439]: ERROR nova.compute.manager [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 905d20ea-65b5-4d7a-a080-9a1bfb47fb42, please check neutron logs for more information. 
[ 742.873037] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 742.873037] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 742.873037] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 742.873037] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 742.873037] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 742.873037] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 742.873037] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 742.873037] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 742.873037] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 742.873037] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 742.873037] env[61439]: ERROR nova.compute.manager raise self.value [ 742.873037] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 742.873037] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 742.873037] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 742.873037] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 742.873526] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 742.873526] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 742.873526] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 905d20ea-65b5-4d7a-a080-9a1bfb47fb42, please check neutron logs for more information. [ 742.873526] env[61439]: ERROR nova.compute.manager [ 742.873526] env[61439]: Traceback (most recent call last): [ 742.873526] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 742.873526] env[61439]: listener.cb(fileno) [ 742.873526] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 742.873526] env[61439]: result = function(*args, **kwargs) [ 742.873526] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 742.873526] env[61439]: return func(*args, **kwargs) [ 742.873526] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 742.873526] env[61439]: raise e [ 742.873526] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 742.873526] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 742.873526] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 742.873526] env[61439]: created_port_ids = self._update_ports_for_instance( [ 742.873526] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 742.873526] env[61439]: with excutils.save_and_reraise_exception(): [ 742.873526] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 742.873526] env[61439]: self.force_reraise() [ 742.873526] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 742.873526] env[61439]: raise self.value [ 742.873526] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 742.873526] env[61439]: 
updated_port = self._update_port( [ 742.873526] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 742.873526] env[61439]: _ensure_no_port_binding_failure(port) [ 742.873526] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 742.873526] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 742.874227] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 905d20ea-65b5-4d7a-a080-9a1bfb47fb42, please check neutron logs for more information. [ 742.874227] env[61439]: Removing descriptor: 20 [ 742.874227] env[61439]: ERROR nova.compute.manager [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 905d20ea-65b5-4d7a-a080-9a1bfb47fb42, please check neutron logs for more information. 
[ 742.874227] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Traceback (most recent call last): [ 742.874227] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 742.874227] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] yield resources [ 742.874227] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 742.874227] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] self.driver.spawn(context, instance, image_meta, [ 742.874227] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 742.874227] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] self._vmops.spawn(context, instance, image_meta, injected_files, [ 742.874227] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 742.874227] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] vm_ref = self.build_virtual_machine(instance, [ 742.874520] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 742.874520] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] vif_infos = vmwarevif.get_vif_info(self._session, [ 742.874520] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 742.874520] env[61439]: ERROR 
nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] for vif in network_info:
[ 742.874520] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 742.874520] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] return self._sync_wrapper(fn, *args, **kwargs)
[ 742.874520] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 742.874520] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] self.wait()
[ 742.874520] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 742.874520] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] self[:] = self._gt.wait()
[ 742.874520] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 742.874520] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] return self._exit_event.wait()
[ 742.874520] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 742.874822] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] result = hub.switch()
[ 742.874822] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 742.874822] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] return self.greenlet.switch()
[ 742.874822] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 742.874822] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] result = function(*args, **kwargs)
[ 742.874822] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 742.874822] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] return func(*args, **kwargs)
[ 742.874822] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 742.874822] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] raise e
[ 742.874822] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 742.874822] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] nwinfo = self.network_api.allocate_for_instance(
[ 742.874822] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 742.874822] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] created_port_ids = self._update_ports_for_instance(
[ 742.875253] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 742.875253] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] with excutils.save_and_reraise_exception():
[ 742.875253] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 742.875253] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] self.force_reraise()
[ 742.875253] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 742.875253] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] raise self.value
[ 742.875253] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 742.875253] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] updated_port = self._update_port(
[ 742.875253] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 742.875253] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] _ensure_no_port_binding_failure(port)
[ 742.875253] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 742.875253] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] raise exception.PortBindingFailed(port_id=port['id'])
[ 742.875563] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] nova.exception.PortBindingFailed: Binding failed for port 905d20ea-65b5-4d7a-a080-9a1bfb47fb42, please check neutron logs for more information.
[ 742.875563] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58]
[ 742.875563] env[61439]: INFO nova.compute.manager [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Terminating instance
[ 742.879127] env[61439]: DEBUG oslo_concurrency.lockutils [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Acquiring lock "refresh_cache-2398e0a0-f13e-47e7-b735-906694ea4d58" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 742.879295] env[61439]: DEBUG oslo_concurrency.lockutils [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Acquired lock "refresh_cache-2398e0a0-f13e-47e7-b735-906694ea4d58" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 742.879470] env[61439]: DEBUG nova.network.neutron [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 742.928961] env[61439]: DEBUG oslo_concurrency.lockutils [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 742.928961] env[61439]: DEBUG oslo_concurrency.lockutils [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 742.929657] env[61439]: INFO nova.compute.claims [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 742.941597] env[61439]: DEBUG nova.compute.manager [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 742.969706] env[61439]: DEBUG nova.virt.hardware [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 742.970205] env[61439]: DEBUG nova.virt.hardware [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 742.970205] env[61439]: DEBUG nova.virt.hardware [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 742.970391] env[61439]: DEBUG nova.virt.hardware [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 742.970509] env[61439]: DEBUG nova.virt.hardware [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 742.970685] env[61439]: DEBUG nova.virt.hardware [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 742.971038] env[61439]: DEBUG nova.virt.hardware [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 742.971865] env[61439]: DEBUG nova.virt.hardware [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 742.972068] env[61439]: DEBUG nova.virt.hardware [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 742.972251] env[61439]: DEBUG nova.virt.hardware [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 742.972451] env[61439]: DEBUG nova.virt.hardware [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 742.973384] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a890707-298b-4ef2-95c5-7bb8e350bff3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 742.983822] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3838ff7b-7315-4cff-8427-915e9b45fe55 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 743.021775] env[61439]: DEBUG nova.network.neutron [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 743.071239] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 743.086106] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Releasing lock "refresh_cache-d79dac42-38fa-401b-9864-5fbdf80b89ec" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 743.086528] env[61439]: DEBUG nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 743.086712] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 743.087293] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-56af98e3-f0c8-44f3-9f06-77bb9ee1aa98 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 743.097783] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0836c2c4-12a6-458c-80ea-c5b2f0aa1b54 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 743.111734] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 743.132536] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance d79dac42-38fa-401b-9864-5fbdf80b89ec could not be found.
[ 743.132858] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 743.133080] env[61439]: INFO nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Took 0.05 seconds to destroy the instance on the hypervisor.
[ 743.133382] env[61439]: DEBUG oslo.service.loopingcall [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 743.133824] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Releasing lock "refresh_cache-2f7123c7-f863-4bb6-a899-5feb618c6ce0" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 743.134033] env[61439]: DEBUG nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 743.134207] env[61439]: DEBUG nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 743.134367] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 743.138648] env[61439]: DEBUG nova.compute.manager [-] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 743.138742] env[61439]: DEBUG nova.network.neutron [-] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 743.213146] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c8562f6-ecd9-4df0-98c8-881a6b7dc145 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 743.216742] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 743.223543] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b242476-4eb5-47cf-bd9a-3c3b120a27b7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 743.227789] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 743.261322] env[61439]: DEBUG nova.policy [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b861ada4972f4431b0b9bd46ae21f7cc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '16074166244d449b99488fc24f4f3d74', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}}
[ 743.263696] env[61439]: DEBUG nova.network.neutron [req-939a81f3-fd90-4ed0-ad34-3324e14cad05 req-9adbeb70-5006-43d2-998e-8ee28883c347 service nova] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 743.268184] env[61439]: INFO nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 2f7123c7-f863-4bb6-a899-5feb618c6ce0] Took 0.13 seconds to deallocate network for instance.
[ 743.268288] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74bfc1e9-8694-414c-b557-21059e0b87fa {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 743.274920] env[61439]: DEBUG oslo_concurrency.lockutils [req-939a81f3-fd90-4ed0-ad34-3324e14cad05 req-9adbeb70-5006-43d2-998e-8ee28883c347 service nova] Releasing lock "refresh_cache-74e5ad9b-1b5b-492d-8642-ed271d8f70e3" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 743.277425] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Acquired lock "refresh_cache-74e5ad9b-1b5b-492d-8642-ed271d8f70e3" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 743.277702] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 743.279487] env[61439]: DEBUG nova.network.neutron [-] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 743.281488] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48f15cb1-87d9-414f-974b-059ceba90631 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 743.300532] env[61439]: DEBUG nova.compute.provider_tree [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 743.301648] env[61439]: DEBUG nova.network.neutron [-] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 743.314462] env[61439]: DEBUG nova.scheduler.client.report [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 743.331505] env[61439]: INFO nova.compute.manager [-] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Took 0.19 seconds to deallocate network for instance.
[ 743.336172] env[61439]: DEBUG nova.compute.claims [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 743.336172] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 743.344646] env[61439]: DEBUG oslo_concurrency.lockutils [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.416s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 743.344893] env[61439]: DEBUG nova.compute.manager [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 743.347794] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.012s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 743.403641] env[61439]: DEBUG nova.compute.utils [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 743.403641] env[61439]: DEBUG nova.compute.manager [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Not allocating networking since 'none' was specified. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 743.422225] env[61439]: DEBUG nova.compute.manager [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 743.425602] env[61439]: INFO nova.scheduler.client.report [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Deleted allocations for instance 2f7123c7-f863-4bb6-a899-5feb618c6ce0
[ 743.453073] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 743.517371] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Lock "2f7123c7-f863-4bb6-a899-5feb618c6ce0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.460s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 743.568786] env[61439]: DEBUG nova.compute.manager [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 743.615016] env[61439]: DEBUG nova.virt.hardware [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 743.615016] env[61439]: DEBUG nova.virt.hardware [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 743.615016] env[61439]: DEBUG nova.virt.hardware [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 743.615351] env[61439]: DEBUG nova.virt.hardware [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 743.615351] env[61439]: DEBUG nova.virt.hardware [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 743.615351] env[61439]: DEBUG nova.virt.hardware [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 743.615351] env[61439]: DEBUG nova.virt.hardware [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 743.615351] env[61439]: DEBUG nova.virt.hardware [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 743.615489] env[61439]: DEBUG nova.virt.hardware [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 743.615489] env[61439]: DEBUG nova.virt.hardware [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 743.615489] env[61439]: DEBUG nova.virt.hardware [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 743.615581] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbaa2b49-2280-4391-8883-02917cab01dc {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 743.623216] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-daafea09-e882-4d0d-a40a-b01d5d89cdc9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 743.635452] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d95ef0e9-8c60-4779-b3b3-d853fd5ee90f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 743.640032] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f727ddc-eb45-442e-ba98-334480c4b087 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 743.655600] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Instance VIF info [] {{(pid=61439) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 743.662158] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Creating folder: Project (660363d19fc745f6b616c24f0477a904). Parent ref: group-v221281. {{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 743.689121] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bfab79ae-b1c6-407b-9b24-25625ff77887 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 743.691114] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23046acd-ee68-4e8c-bf7a-2d10ebed8469 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 743.699541] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c32d9e9-a9f6-43f3-9748-79af0d80f38d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 743.706459] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Created folder: Project (660363d19fc745f6b616c24f0477a904) in parent group-v221281.
[ 743.706459] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Creating folder: Instances. Parent ref: group-v221300. {{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 743.710022] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-cb58aedb-5d43-4a62-9e88-708ecf2e097c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 743.716799] env[61439]: DEBUG nova.compute.provider_tree [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 743.722908] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Created folder: Instances in parent group-v221300.
[ 743.722908] env[61439]: DEBUG oslo.service.loopingcall [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return.
{{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 743.722908] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Creating VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 743.722908] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-2076279f-cc48-4742-9b79-0d5f55abb039 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 743.734893] env[61439]: DEBUG nova.scheduler.client.report [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 743.744981] env[61439]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 743.744981] env[61439]: value = "task-987675" [ 743.744981] env[61439]: _type = "Task" [ 743.744981] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 743.754163] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987675, 'name': CreateVM_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 743.755759] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.408s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 743.756410] env[61439]: ERROR nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port e6d1e53a-74f0-4da3-8ede-885bbd0c7111, please check neutron logs for more information. [ 743.756410] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Traceback (most recent call last): [ 743.756410] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 743.756410] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] self.driver.spawn(context, instance, image_meta, [ 743.756410] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 743.756410] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] self._vmops.spawn(context, instance, image_meta, injected_files, [ 743.756410] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 743.756410] env[61439]: 
ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] vm_ref = self.build_virtual_machine(instance, [ 743.756410] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 743.756410] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] vif_infos = vmwarevif.get_vif_info(self._session, [ 743.756410] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 743.756772] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] for vif in network_info: [ 743.756772] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 743.756772] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] return self._sync_wrapper(fn, *args, **kwargs) [ 743.756772] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 743.756772] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] self.wait() [ 743.756772] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 743.756772] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] self[:] = self._gt.wait() [ 743.756772] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 743.756772] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] return 
self._exit_event.wait() [ 743.756772] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 743.756772] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] result = hub.switch() [ 743.756772] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 743.756772] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] return self.greenlet.switch() [ 743.757079] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 743.757079] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] result = function(*args, **kwargs) [ 743.757079] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 743.757079] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] return func(*args, **kwargs) [ 743.757079] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 743.757079] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] raise e [ 743.757079] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 743.757079] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] nwinfo = self.network_api.allocate_for_instance( [ 743.757079] env[61439]: ERROR 
nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 743.757079] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] created_port_ids = self._update_ports_for_instance( [ 743.757079] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 743.757079] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] with excutils.save_and_reraise_exception(): [ 743.757079] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 743.757383] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] self.force_reraise() [ 743.757383] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 743.757383] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] raise self.value [ 743.757383] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 743.757383] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] updated_port = self._update_port( [ 743.757383] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 743.757383] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] _ensure_no_port_binding_failure(port) [ 743.757383] env[61439]: ERROR nova.compute.manager 
[instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 743.757383] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] raise exception.PortBindingFailed(port_id=port['id']) [ 743.757383] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] nova.exception.PortBindingFailed: Binding failed for port e6d1e53a-74f0-4da3-8ede-885bbd0c7111, please check neutron logs for more information. [ 743.757383] env[61439]: ERROR nova.compute.manager [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] [ 743.757675] env[61439]: DEBUG nova.compute.utils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Binding failed for port e6d1e53a-74f0-4da3-8ede-885bbd0c7111, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 743.758621] env[61439]: DEBUG nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Build of instance d79dac42-38fa-401b-9864-5fbdf80b89ec was re-scheduled: Binding failed for port e6d1e53a-74f0-4da3-8ede-885bbd0c7111, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 743.759064] env[61439]: DEBUG nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 743.759303] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Acquiring lock "refresh_cache-d79dac42-38fa-401b-9864-5fbdf80b89ec" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 743.759455] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Acquired lock "refresh_cache-d79dac42-38fa-401b-9864-5fbdf80b89ec" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 743.759613] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 743.803274] env[61439]: DEBUG nova.compute.manager [req-8166316e-1a93-4f43-88b0-84cc011f6832 req-3b1c54b3-c500-4f60-9a5e-ee1b733edba0 service nova] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Received event network-changed-905d20ea-65b5-4d7a-a080-9a1bfb47fb42 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 743.803483] env[61439]: 
DEBUG nova.compute.manager [req-8166316e-1a93-4f43-88b0-84cc011f6832 req-3b1c54b3-c500-4f60-9a5e-ee1b733edba0 service nova] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Refreshing instance network info cache due to event network-changed-905d20ea-65b5-4d7a-a080-9a1bfb47fb42. {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 743.803706] env[61439]: DEBUG oslo_concurrency.lockutils [req-8166316e-1a93-4f43-88b0-84cc011f6832 req-3b1c54b3-c500-4f60-9a5e-ee1b733edba0 service nova] Acquiring lock "refresh_cache-2398e0a0-f13e-47e7-b735-906694ea4d58" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 744.081091] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 744.152455] env[61439]: DEBUG nova.network.neutron [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 744.172239] env[61439]: DEBUG oslo_concurrency.lockutils [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Releasing lock "refresh_cache-2398e0a0-f13e-47e7-b735-906694ea4d58" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 744.172920] env[61439]: DEBUG nova.compute.manager [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 
tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 744.173558] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 744.173840] env[61439]: DEBUG oslo_concurrency.lockutils [req-8166316e-1a93-4f43-88b0-84cc011f6832 req-3b1c54b3-c500-4f60-9a5e-ee1b733edba0 service nova] Acquired lock "refresh_cache-2398e0a0-f13e-47e7-b735-906694ea4d58" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 744.176035] env[61439]: DEBUG nova.network.neutron [req-8166316e-1a93-4f43-88b0-84cc011f6832 req-3b1c54b3-c500-4f60-9a5e-ee1b733edba0 service nova] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Refreshing network info cache for port 905d20ea-65b5-4d7a-a080-9a1bfb47fb42 {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 744.177182] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6298a039-2b33-43d0-a7bc-1bf3e67db3b7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 744.198858] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9edbde9-d3be-4d1e-9d21-91b03d687164 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 744.228172] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 
tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 2398e0a0-f13e-47e7-b735-906694ea4d58 could not be found. [ 744.228172] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 744.228172] env[61439]: INFO nova.compute.manager [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Took 0.06 seconds to destroy the instance on the hypervisor. [ 744.228411] env[61439]: DEBUG oslo.service.loopingcall [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 744.228946] env[61439]: DEBUG nova.compute.manager [-] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 744.229048] env[61439]: DEBUG nova.network.neutron [-] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 744.257320] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987675, 'name': CreateVM_Task, 'duration_secs': 0.35062} completed successfully. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 744.257573] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Created VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 744.258053] env[61439]: DEBUG oslo_concurrency.lockutils [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 744.258216] env[61439]: DEBUG oslo_concurrency.lockutils [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 744.259025] env[61439]: DEBUG oslo_concurrency.lockutils [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 744.259285] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-40b573a1-9575-499b-aa63-86329721c9c3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 744.266285] env[61439]: DEBUG oslo_vmware.api [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Waiting for the task: 
(returnval){ [ 744.266285] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52c0acb8-f5d9-9da4-88d0-7f7f6c5528ad" [ 744.266285] env[61439]: _type = "Task" [ 744.266285] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 744.276243] env[61439]: DEBUG oslo_vmware.api [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52c0acb8-f5d9-9da4-88d0-7f7f6c5528ad, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 744.359467] env[61439]: DEBUG nova.network.neutron [-] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 744.373881] env[61439]: DEBUG nova.network.neutron [-] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 744.400202] env[61439]: INFO nova.compute.manager [-] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Took 0.17 seconds to deallocate network for instance. 
[ 744.403628] env[61439]: DEBUG nova.compute.claims [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 744.404159] env[61439]: DEBUG oslo_concurrency.lockutils [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 744.404718] env[61439]: DEBUG oslo_concurrency.lockutils [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 744.496716] env[61439]: DEBUG nova.network.neutron [req-8166316e-1a93-4f43-88b0-84cc011f6832 req-3b1c54b3-c500-4f60-9a5e-ee1b733edba0 service nova] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 744.683445] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f807eb9d-0832-476f-9956-1fb51ccb9f2f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 744.698499] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cde3e79f-70e4-46e4-9949-060fc43373a4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 744.746775] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 744.748767] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a735494-6a00-46e1-9875-0346eb3278ba {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 744.758099] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12fb8be5-c9c8-4671-9c9b-8d4a540cf7ca {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 744.766028] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Releasing lock "refresh_cache-74e5ad9b-1b5b-492d-8642-ed271d8f70e3" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 744.766028] env[61439]: DEBUG 
nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 744.766028] env[61439]: DEBUG nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 744.766028] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 744.776742] env[61439]: DEBUG nova.compute.provider_tree [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 744.790474] env[61439]: DEBUG oslo_concurrency.lockutils [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 744.790731] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-17e35825-abc3-4ce8-be80-288f7c0e539d 
tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Processing image a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 744.791013] env[61439]: DEBUG oslo_concurrency.lockutils [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 744.798501] env[61439]: DEBUG nova.scheduler.client.report [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 744.828995] env[61439]: DEBUG oslo_concurrency.lockutils [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.424s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 744.829300] env[61439]: ERROR nova.compute.manager [None 
req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 905d20ea-65b5-4d7a-a080-9a1bfb47fb42, please check neutron logs for more information. [ 744.829300] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Traceback (most recent call last): [ 744.829300] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 744.829300] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] self.driver.spawn(context, instance, image_meta, [ 744.829300] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 744.829300] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] self._vmops.spawn(context, instance, image_meta, injected_files, [ 744.829300] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 744.829300] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] vm_ref = self.build_virtual_machine(instance, [ 744.829300] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 744.829300] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] vif_infos = vmwarevif.get_vif_info(self._session, [ 744.829300] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File 
"/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 744.829582] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] for vif in network_info: [ 744.829582] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 744.829582] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] return self._sync_wrapper(fn, *args, **kwargs) [ 744.829582] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 744.829582] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] self.wait() [ 744.829582] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 744.829582] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] self[:] = self._gt.wait() [ 744.829582] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 744.829582] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] return self._exit_event.wait() [ 744.829582] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 744.829582] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] result = hub.switch() [ 744.829582] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 744.829582] env[61439]: ERROR 
nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] return self.greenlet.switch() [ 744.829861] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 744.829861] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] result = function(*args, **kwargs) [ 744.829861] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 744.829861] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] return func(*args, **kwargs) [ 744.829861] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 744.829861] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] raise e [ 744.829861] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 744.829861] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] nwinfo = self.network_api.allocate_for_instance( [ 744.829861] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 744.829861] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] created_port_ids = self._update_ports_for_instance( [ 744.829861] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 744.829861] env[61439]: ERROR nova.compute.manager [instance: 
2398e0a0-f13e-47e7-b735-906694ea4d58] with excutils.save_and_reraise_exception(): [ 744.829861] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 744.830151] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] self.force_reraise() [ 744.830151] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 744.830151] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] raise self.value [ 744.830151] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 744.830151] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] updated_port = self._update_port( [ 744.830151] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 744.830151] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] _ensure_no_port_binding_failure(port) [ 744.830151] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 744.830151] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] raise exception.PortBindingFailed(port_id=port['id']) [ 744.830151] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] nova.exception.PortBindingFailed: Binding failed for port 905d20ea-65b5-4d7a-a080-9a1bfb47fb42, please check neutron logs for more information. 
[ 744.830151] env[61439]: ERROR nova.compute.manager [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] [ 744.830464] env[61439]: DEBUG nova.compute.utils [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Binding failed for port 905d20ea-65b5-4d7a-a080-9a1bfb47fb42, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 744.836168] env[61439]: DEBUG nova.compute.manager [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Build of instance 2398e0a0-f13e-47e7-b735-906694ea4d58 was re-scheduled: Binding failed for port 905d20ea-65b5-4d7a-a080-9a1bfb47fb42, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 744.836168] env[61439]: DEBUG nova.compute.manager [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 744.836336] env[61439]: DEBUG oslo_concurrency.lockutils [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Acquiring lock "refresh_cache-2398e0a0-f13e-47e7-b735-906694ea4d58" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 744.936012] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 
tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 744.945290] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 744.947720] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 744.959039] env[61439]: INFO nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: 74e5ad9b-1b5b-492d-8642-ed271d8f70e3] Took 0.19 seconds to deallocate network for instance. 
[ 744.965754] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Releasing lock "refresh_cache-d79dac42-38fa-401b-9864-5fbdf80b89ec" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 744.965987] env[61439]: DEBUG nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 744.966184] env[61439]: DEBUG nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 744.966347] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 745.089475] env[61439]: INFO nova.scheduler.client.report [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Deleted allocations for instance 74e5ad9b-1b5b-492d-8642-ed271d8f70e3 [ 745.103441] env[61439]: DEBUG nova.network.neutron [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 
tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Successfully created port: 690df5ae-3e80-4edb-ac87-8f8780fda17c {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 745.118675] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Lock "74e5ad9b-1b5b-492d-8642-ed271d8f70e3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 17.028s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 745.322460] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 745.332862] env[61439]: DEBUG nova.network.neutron [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 745.350686] env[61439]: INFO nova.compute.manager [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] [instance: d79dac42-38fa-401b-9864-5fbdf80b89ec] Took 0.38 seconds to deallocate network for instance. 
[ 745.508393] env[61439]: INFO nova.scheduler.client.report [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Deleted allocations for instance d79dac42-38fa-401b-9864-5fbdf80b89ec [ 745.547302] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c52a553a-2135-4689-9e47-2284985004d9 tempest-ListServersNegativeTestJSON-979928406 tempest-ListServersNegativeTestJSON-979928406-project-member] Lock "d79dac42-38fa-401b-9864-5fbdf80b89ec" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 17.421s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 745.698571] env[61439]: DEBUG nova.network.neutron [req-8166316e-1a93-4f43-88b0-84cc011f6832 req-3b1c54b3-c500-4f60-9a5e-ee1b733edba0 service nova] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 745.720842] env[61439]: DEBUG oslo_concurrency.lockutils [req-8166316e-1a93-4f43-88b0-84cc011f6832 req-3b1c54b3-c500-4f60-9a5e-ee1b733edba0 service nova] Releasing lock "refresh_cache-2398e0a0-f13e-47e7-b735-906694ea4d58" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 745.720842] env[61439]: DEBUG oslo_concurrency.lockutils [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Acquired lock "refresh_cache-2398e0a0-f13e-47e7-b735-906694ea4d58" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 745.720842] env[61439]: DEBUG nova.network.neutron [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 
tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 745.849627] env[61439]: DEBUG nova.network.neutron [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 745.893123] env[61439]: DEBUG oslo_concurrency.lockutils [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Acquiring lock "375d8f80-430a-4856-9b83-33c4aa945a57" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 745.893123] env[61439]: DEBUG oslo_concurrency.lockutils [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Lock "375d8f80-430a-4856-9b83-33c4aa945a57" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 745.907899] env[61439]: DEBUG nova.compute.manager [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 745.979038] env[61439]: DEBUG oslo_concurrency.lockutils [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 745.979297] env[61439]: DEBUG oslo_concurrency.lockutils [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 745.981338] env[61439]: INFO nova.compute.claims [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 746.249377] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d06c42c4-d784-4c09-a627-0d0a7192112f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 746.262323] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60e3c5e4-df50-41cf-ab73-b3cc88347d15 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 746.301225] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff331518-aaff-41ee-9b56-9d56b029cf8d {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 746.309879] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fd9d43f-acb7-4c52-a6d3-752a6b51e985 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 746.331511] env[61439]: DEBUG nova.compute.provider_tree [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 746.342936] env[61439]: DEBUG nova.scheduler.client.report [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 746.361205] env[61439]: DEBUG oslo_concurrency.lockutils [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.382s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 746.361738] env[61439]: DEBUG nova.compute.manager [None 
req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 746.377980] env[61439]: DEBUG nova.network.neutron [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Successfully created port: 6e291bb6-0f87-4960-b168-42bfea1ff96f {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 746.412687] env[61439]: DEBUG nova.compute.utils [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 746.414067] env[61439]: DEBUG nova.compute.manager [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 746.414245] env[61439]: DEBUG nova.network.neutron [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 746.427320] env[61439]: DEBUG nova.compute.manager [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 746.509785] env[61439]: DEBUG nova.network.neutron [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 746.522720] env[61439]: DEBUG oslo_concurrency.lockutils [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Releasing lock "refresh_cache-2398e0a0-f13e-47e7-b735-906694ea4d58" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 746.522982] env[61439]: DEBUG nova.compute.manager [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 746.523232] env[61439]: DEBUG nova.compute.manager [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 746.523373] env[61439]: DEBUG nova.network.neutron [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 746.538568] env[61439]: DEBUG nova.compute.manager [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 746.576336] env[61439]: DEBUG nova.virt.hardware [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 746.576546] env[61439]: DEBUG nova.virt.hardware [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 746.576891] env[61439]: DEBUG nova.virt.hardware [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 746.576891] env[61439]: DEBUG nova.virt.hardware [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Flavor pref 
0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 746.577150] env[61439]: DEBUG nova.virt.hardware [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 746.577323] env[61439]: DEBUG nova.virt.hardware [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 746.577536] env[61439]: DEBUG nova.virt.hardware [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 746.577696] env[61439]: DEBUG nova.virt.hardware [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 746.577855] env[61439]: DEBUG nova.virt.hardware [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 746.578117] env[61439]: DEBUG nova.virt.hardware [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 
tempest-AttachInterfacesTestJSON-210578950-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 746.578202] env[61439]: DEBUG nova.virt.hardware [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 746.579077] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1ef47d1-d0f2-41c0-901d-5eaf60cde085 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 746.591020] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8f4664b-23ce-4f1f-90a2-0650d1ed7b3c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 746.800404] env[61439]: DEBUG nova.network.neutron [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 746.811163] env[61439]: DEBUG nova.network.neutron [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 746.824730] env[61439]: INFO nova.compute.manager [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] [instance: 2398e0a0-f13e-47e7-b735-906694ea4d58] Took 0.30 seconds to deallocate network for instance. [ 746.953110] env[61439]: INFO nova.scheduler.client.report [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Deleted allocations for instance 2398e0a0-f13e-47e7-b735-906694ea4d58 [ 746.979081] env[61439]: DEBUG oslo_concurrency.lockutils [None req-61da2cdd-73df-4cd8-8666-6efb976502ee tempest-ServerActionsTestOtherB-1936320080 tempest-ServerActionsTestOtherB-1936320080-project-member] Lock "2398e0a0-f13e-47e7-b735-906694ea4d58" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 18.028s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 747.122270] env[61439]: DEBUG nova.policy [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e64ca57e567146098521cd7356b9e3e2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 
'3e1db803f0ff4f29bb70e0a0d94c57e0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 747.307258] env[61439]: ERROR nova.compute.manager [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 55d06218-1ea4-47f7-81e5-59f167d3ea24, please check neutron logs for more information. [ 747.307258] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 747.307258] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 747.307258] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 747.307258] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 747.307258] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 747.307258] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 747.307258] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 747.307258] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 747.307258] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 747.307258] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 747.307258] env[61439]: ERROR nova.compute.manager raise 
self.value [ 747.307258] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 747.307258] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 747.307258] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 747.307258] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 747.307711] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 747.307711] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 747.307711] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 55d06218-1ea4-47f7-81e5-59f167d3ea24, please check neutron logs for more information. [ 747.307711] env[61439]: ERROR nova.compute.manager [ 747.307711] env[61439]: Traceback (most recent call last): [ 747.307711] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 747.307711] env[61439]: listener.cb(fileno) [ 747.307711] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 747.307711] env[61439]: result = function(*args, **kwargs) [ 747.307711] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 747.307711] env[61439]: return func(*args, **kwargs) [ 747.307711] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 747.307711] env[61439]: raise e [ 747.307711] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 747.307711] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 747.307711] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 747.307711] env[61439]: 
created_port_ids = self._update_ports_for_instance( [ 747.307711] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 747.307711] env[61439]: with excutils.save_and_reraise_exception(): [ 747.307711] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 747.307711] env[61439]: self.force_reraise() [ 747.307711] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 747.307711] env[61439]: raise self.value [ 747.307711] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 747.307711] env[61439]: updated_port = self._update_port( [ 747.307711] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 747.307711] env[61439]: _ensure_no_port_binding_failure(port) [ 747.307711] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 747.307711] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 747.308376] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 55d06218-1ea4-47f7-81e5-59f167d3ea24, please check neutron logs for more information. [ 747.308376] env[61439]: Removing descriptor: 24 [ 747.308376] env[61439]: ERROR nova.compute.manager [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 55d06218-1ea4-47f7-81e5-59f167d3ea24, please check neutron logs for more information. 
[ 747.308376] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Traceback (most recent call last): [ 747.308376] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 747.308376] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] yield resources [ 747.308376] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 747.308376] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] self.driver.spawn(context, instance, image_meta, [ 747.308376] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 747.308376] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] self._vmops.spawn(context, instance, image_meta, injected_files, [ 747.308376] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 747.308376] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] vm_ref = self.build_virtual_machine(instance, [ 747.308667] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 747.308667] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] vif_infos = vmwarevif.get_vif_info(self._session, [ 747.308667] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 747.308667] env[61439]: ERROR 
nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] for vif in network_info: [ 747.308667] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 747.308667] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] return self._sync_wrapper(fn, *args, **kwargs) [ 747.308667] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 747.308667] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] self.wait() [ 747.308667] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 747.308667] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] self[:] = self._gt.wait() [ 747.308667] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 747.308667] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] return self._exit_event.wait() [ 747.308667] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 747.308982] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] result = hub.switch() [ 747.308982] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 747.308982] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] return self.greenlet.switch() [ 747.308982] 
env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 747.308982] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] result = function(*args, **kwargs) [ 747.308982] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 747.308982] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] return func(*args, **kwargs) [ 747.308982] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 747.308982] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] raise e [ 747.308982] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 747.308982] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] nwinfo = self.network_api.allocate_for_instance( [ 747.308982] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 747.308982] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] created_port_ids = self._update_ports_for_instance( [ 747.309346] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 747.309346] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] with excutils.save_and_reraise_exception(): [ 747.309346] env[61439]: ERROR nova.compute.manager 
[instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 747.309346] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] self.force_reraise() [ 747.309346] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 747.309346] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] raise self.value [ 747.309346] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 747.309346] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] updated_port = self._update_port( [ 747.309346] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 747.309346] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] _ensure_no_port_binding_failure(port) [ 747.309346] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 747.309346] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] raise exception.PortBindingFailed(port_id=port['id']) [ 747.309636] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] nova.exception.PortBindingFailed: Binding failed for port 55d06218-1ea4-47f7-81e5-59f167d3ea24, please check neutron logs for more information. 
[ 747.309636] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] [ 747.309636] env[61439]: INFO nova.compute.manager [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Terminating instance [ 747.310920] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Acquiring lock "refresh_cache-abdb8903-7574-40a4-ac7e-345b22fe1141" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 747.310920] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Acquired lock "refresh_cache-abdb8903-7574-40a4-ac7e-345b22fe1141" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 747.311160] env[61439]: DEBUG nova.network.neutron [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 747.430094] env[61439]: DEBUG nova.network.neutron [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 748.338647] env[61439]: DEBUG nova.network.neutron [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 748.353959] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Releasing lock "refresh_cache-abdb8903-7574-40a4-ac7e-345b22fe1141" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 748.354599] env[61439]: DEBUG nova.compute.manager [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 748.355063] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 748.356618] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-88a2a8bc-4e03-48d5-8c70-b18f020f14b0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 748.370772] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94746884-6bd3-4a9b-811e-e524ab9ef577 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 748.404637] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance abdb8903-7574-40a4-ac7e-345b22fe1141 could not be found. 
[ 748.404940] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 748.405282] env[61439]: INFO nova.compute.manager [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Took 0.05 seconds to destroy the instance on the hypervisor. [ 748.405600] env[61439]: DEBUG oslo.service.loopingcall [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 748.405904] env[61439]: DEBUG nova.compute.manager [-] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 748.406051] env[61439]: DEBUG nova.network.neutron [-] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 748.437695] env[61439]: DEBUG nova.compute.manager [req-bfb4e688-3944-4d2e-ab4a-4489e23008aa req-e5785ab8-c91e-4bdf-9abf-9834b14e02bb service nova] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Received event network-changed-55d06218-1ea4-47f7-81e5-59f167d3ea24 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 748.437949] env[61439]: DEBUG nova.compute.manager [req-bfb4e688-3944-4d2e-ab4a-4489e23008aa req-e5785ab8-c91e-4bdf-9abf-9834b14e02bb service nova] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Refreshing instance network info cache due to event network-changed-55d06218-1ea4-47f7-81e5-59f167d3ea24. 
{{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 748.438241] env[61439]: DEBUG oslo_concurrency.lockutils [req-bfb4e688-3944-4d2e-ab4a-4489e23008aa req-e5785ab8-c91e-4bdf-9abf-9834b14e02bb service nova] Acquiring lock "refresh_cache-abdb8903-7574-40a4-ac7e-345b22fe1141" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 748.438408] env[61439]: DEBUG oslo_concurrency.lockutils [req-bfb4e688-3944-4d2e-ab4a-4489e23008aa req-e5785ab8-c91e-4bdf-9abf-9834b14e02bb service nova] Acquired lock "refresh_cache-abdb8903-7574-40a4-ac7e-345b22fe1141" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 748.439591] env[61439]: DEBUG nova.network.neutron [req-bfb4e688-3944-4d2e-ab4a-4489e23008aa req-e5785ab8-c91e-4bdf-9abf-9834b14e02bb service nova] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Refreshing network info cache for port 55d06218-1ea4-47f7-81e5-59f167d3ea24 {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 748.553483] env[61439]: DEBUG nova.network.neutron [-] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 748.567173] env[61439]: DEBUG nova.network.neutron [-] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 748.579036] env[61439]: INFO nova.compute.manager [-] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Took 0.17 seconds to deallocate network for instance. 
[ 748.581774] env[61439]: DEBUG nova.compute.claims [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 748.581952] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 748.582194] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 748.593148] env[61439]: DEBUG nova.network.neutron [req-bfb4e688-3944-4d2e-ab4a-4489e23008aa req-e5785ab8-c91e-4bdf-9abf-9834b14e02bb service nova] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 748.807760] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a21c415d-abdc-4ca4-ad0d-9aca9b6058da {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 748.816790] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2e90e8f-b171-49c8-a845-53d669eb4b74 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 748.848740] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27b2da7b-1e79-46d6-9a46-6101df38d4bf {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 748.857291] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d80590b0-1d83-46d4-a1b6-7315ed01248e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 748.873460] env[61439]: DEBUG nova.compute.provider_tree [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 748.884581] env[61439]: DEBUG nova.scheduler.client.report [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 
'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 748.902517] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.320s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 748.902988] env[61439]: ERROR nova.compute.manager [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 55d06218-1ea4-47f7-81e5-59f167d3ea24, please check neutron logs for more information. 
[ 748.902988] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Traceback (most recent call last): [ 748.902988] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 748.902988] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] self.driver.spawn(context, instance, image_meta, [ 748.902988] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 748.902988] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] self._vmops.spawn(context, instance, image_meta, injected_files, [ 748.902988] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 748.902988] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] vm_ref = self.build_virtual_machine(instance, [ 748.902988] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 748.902988] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] vif_infos = vmwarevif.get_vif_info(self._session, [ 748.902988] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 748.903408] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] for vif in network_info: [ 748.903408] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 748.903408] env[61439]: ERROR 
nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] return self._sync_wrapper(fn, *args, **kwargs)
[ 748.903408] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 748.903408] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]     self.wait()
[ 748.903408] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 748.903408] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]     self[:] = self._gt.wait()
[ 748.903408] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 748.903408] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]     return self._exit_event.wait()
[ 748.903408] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 748.903408] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]     result = hub.switch()
[ 748.903408] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 748.903408] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]     return self.greenlet.switch()
[ 748.903723] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 748.903723] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]     result = function(*args, **kwargs)
[ 748.903723] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]   File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 748.903723] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]     return func(*args, **kwargs)
[ 748.903723] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]   File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 748.903723] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]     raise e
[ 748.903723] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]   File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 748.903723] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]     nwinfo = self.network_api.allocate_for_instance(
[ 748.903723] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 748.903723] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]     created_port_ids = self._update_ports_for_instance(
[ 748.903723] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 748.903723] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]     with excutils.save_and_reraise_exception():
[ 748.903723] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 748.904049] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]     self.force_reraise()
[ 748.904049] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 748.904049] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]     raise self.value
[ 748.904049] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 748.904049] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]     updated_port = self._update_port(
[ 748.904049] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 748.904049] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]     _ensure_no_port_binding_failure(port)
[ 748.904049] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 748.904049] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141]     raise exception.PortBindingFailed(port_id=port['id'])
[ 748.904049] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] nova.exception.PortBindingFailed: Binding failed for port 55d06218-1ea4-47f7-81e5-59f167d3ea24, please check neutron logs for more information.
[ 748.904049] env[61439]: ERROR nova.compute.manager [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] [ 748.904320] env[61439]: DEBUG nova.compute.utils [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Binding failed for port 55d06218-1ea4-47f7-81e5-59f167d3ea24, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 748.906315] env[61439]: DEBUG nova.compute.manager [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Build of instance abdb8903-7574-40a4-ac7e-345b22fe1141 was re-scheduled: Binding failed for port 55d06218-1ea4-47f7-81e5-59f167d3ea24, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 748.906739] env[61439]: DEBUG nova.compute.manager [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 748.908313] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Acquiring lock "refresh_cache-abdb8903-7574-40a4-ac7e-345b22fe1141" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 749.456968] env[61439]: DEBUG nova.network.neutron [req-bfb4e688-3944-4d2e-ab4a-4489e23008aa req-e5785ab8-c91e-4bdf-9abf-9834b14e02bb service nova] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 749.470864] env[61439]: DEBUG oslo_concurrency.lockutils [req-bfb4e688-3944-4d2e-ab4a-4489e23008aa req-e5785ab8-c91e-4bdf-9abf-9834b14e02bb service nova] Releasing lock "refresh_cache-abdb8903-7574-40a4-ac7e-345b22fe1141" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 749.472171] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Acquired lock "refresh_cache-abdb8903-7574-40a4-ac7e-345b22fe1141" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 749.472536] env[61439]: DEBUG nova.network.neutron [None 
req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 749.568923] env[61439]: DEBUG nova.network.neutron [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 750.120504] env[61439]: DEBUG nova.network.neutron [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Successfully created port: 09684df6-9453-4d3e-be24-48120b3abdab {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 750.522208] env[61439]: DEBUG nova.network.neutron [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 750.541898] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Releasing lock "refresh_cache-abdb8903-7574-40a4-ac7e-345b22fe1141" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 750.541898] env[61439]: DEBUG nova.compute.manager [None 
req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 750.541898] env[61439]: DEBUG nova.compute.manager [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 750.541898] env[61439]: DEBUG nova.network.neutron [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 750.648646] env[61439]: DEBUG nova.network.neutron [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 750.658927] env[61439]: DEBUG nova.network.neutron [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 750.675297] env[61439]: INFO nova.compute.manager [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] [instance: abdb8903-7574-40a4-ac7e-345b22fe1141] Took 0.13 seconds to deallocate network for instance. [ 750.792188] env[61439]: INFO nova.scheduler.client.report [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Deleted allocations for instance abdb8903-7574-40a4-ac7e-345b22fe1141 [ 750.821469] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0c5ab126-e328-4542-864e-db8263bec43d tempest-ServerMetadataNegativeTestJSON-1006936469 tempest-ServerMetadataNegativeTestJSON-1006936469-project-member] Lock "abdb8903-7574-40a4-ac7e-345b22fe1141" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 17.927s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 751.825417] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "09300ec8-8a0e-4447-a32e-dd232e74fc53" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 751.825744] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "09300ec8-8a0e-4447-a32e-dd232e74fc53" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 751.841026] env[61439]: DEBUG nova.compute.manager [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 751.914164] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 751.914447] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 751.915934] env[61439]: INFO nova.compute.claims [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Claim 
successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 752.126466] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c3c5b6d-25c1-45d0-bc5c-21830ba0811d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 752.136959] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5036a40-7ddc-4101-9301-911ff2d0782b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 752.179043] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ecd68ea4-8a2d-4452-b4b4-88bc3d00db0b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 752.186946] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fa03b2c-dc6c-460d-b345-cd5803977fe8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 752.207500] env[61439]: DEBUG nova.compute.provider_tree [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 752.218133] env[61439]: DEBUG nova.scheduler.client.report [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 752.244363] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.330s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 752.244909] env[61439]: DEBUG nova.compute.manager [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 752.301631] env[61439]: DEBUG nova.compute.utils [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 752.303157] env[61439]: DEBUG nova.compute.manager [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 752.303848] env[61439]: DEBUG nova.network.neutron [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 752.321371] env[61439]: DEBUG nova.compute.manager [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 752.400305] env[61439]: DEBUG nova.compute.manager [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 752.428364] env[61439]: DEBUG nova.virt.hardware [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 752.429180] env[61439]: DEBUG nova.virt.hardware [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 752.429180] env[61439]: DEBUG nova.virt.hardware [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 752.429368] env[61439]: DEBUG nova.virt.hardware [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Flavor pref 0:0:0 {{(pid=61439) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 752.429621] env[61439]: DEBUG nova.virt.hardware [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 752.430318] env[61439]: DEBUG nova.virt.hardware [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 752.430557] env[61439]: DEBUG nova.virt.hardware [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 752.430753] env[61439]: DEBUG nova.virt.hardware [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 752.430954] env[61439]: DEBUG nova.virt.hardware [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 752.431179] env[61439]: DEBUG nova.virt.hardware [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 752.431361] env[61439]: DEBUG nova.virt.hardware [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 752.432541] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7e7ddaa-9708-4158-8b6e-e2f7f1553c0e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 752.441619] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d1dff34-6e33-446a-975a-b2bd62a9a9f5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 752.874252] env[61439]: DEBUG nova.policy [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2af2fd8431af45ca891f744f4d10b54f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ca364a2df93a424f8b66ee39d9b0b120', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 753.934612] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Acquiring lock 
"bf9101c9-4072-4f72-8ac3-24b7a5b88b45" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 753.937015] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Lock "bf9101c9-4072-4f72-8ac3-24b7a5b88b45" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 753.952240] env[61439]: DEBUG nova.compute.manager [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 754.034062] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 754.035508] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 754.035708] env[61439]: INFO nova.compute.claims [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 754.275789] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e06f087c-e61c-4a66-92be-f06cf73118f8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 754.286595] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f98021d-3b28-41a5-9ae2-c1daff2b3304 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 754.327009] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3ffdfa9-cedc-4e30-bfd8-2df3f090ede4 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 754.338109] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c735176-e6ad-48a8-ba66-1ecc5486bb7d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 754.352052] env[61439]: DEBUG nova.compute.provider_tree [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 754.367875] env[61439]: DEBUG nova.scheduler.client.report [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 754.390454] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.356s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 754.390958] env[61439]: DEBUG nova.compute.manager [None 
req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 754.433862] env[61439]: DEBUG nova.compute.utils [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 754.437118] env[61439]: DEBUG nova.compute.manager [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Not allocating networking since 'none' was specified. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 754.454797] env[61439]: DEBUG nova.compute.manager [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 754.559502] env[61439]: DEBUG nova.compute.manager [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 754.586965] env[61439]: DEBUG nova.virt.hardware [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 754.587223] env[61439]: DEBUG nova.virt.hardware [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 754.587425] env[61439]: DEBUG nova.virt.hardware [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 754.589430] env[61439]: DEBUG nova.virt.hardware [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Flavor 
pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 754.589628] env[61439]: DEBUG nova.virt.hardware [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 754.589788] env[61439]: DEBUG nova.virt.hardware [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 754.590014] env[61439]: DEBUG nova.virt.hardware [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 754.590193] env[61439]: DEBUG nova.virt.hardware [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 754.590795] env[61439]: DEBUG nova.virt.hardware [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 754.591541] env[61439]: DEBUG nova.virt.hardware [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 
tempest-ServerDiagnosticsV248Test-537127248-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 754.591541] env[61439]: DEBUG nova.virt.hardware [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 754.591961] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee35789c-8907-41dc-b73b-d71c75d246e8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 754.605443] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8d9b50a-fa7c-424b-87c9-956a96763acd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 754.619939] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Instance VIF info [] {{(pid=61439) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 754.626960] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Creating folder: Project (5d52efc6161d484bbe7f70640caa646b). Parent ref: group-v221281. 
{{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 754.626960] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-72302568-e6f3-4bf0-abb7-a50b1cd289f5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 754.638315] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Created folder: Project (5d52efc6161d484bbe7f70640caa646b) in parent group-v221281. [ 754.638485] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Creating folder: Instances. Parent ref: group-v221303. {{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 754.638718] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-83375dde-72fc-4934-a65e-24099adb39bd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 754.649373] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Created folder: Instances in parent group-v221303. [ 754.649373] env[61439]: DEBUG oslo.service.loopingcall [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 754.649373] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Creating VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 754.649373] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-bd14d092-d15a-4afa-a3eb-09ab4b08fcf4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 754.669312] env[61439]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 754.669312] env[61439]: value = "task-987681" [ 754.669312] env[61439]: _type = "Task" [ 754.669312] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 754.677917] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987681, 'name': CreateVM_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 755.184923] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987681, 'name': CreateVM_Task, 'duration_secs': 0.29954} completed successfully. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 755.185830] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Created VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 755.186430] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 755.187116] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 755.187525] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 755.190486] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-729f3d99-417f-4193-b091-119d512e9264 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 755.195916] env[61439]: DEBUG oslo_vmware.api [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 
tempest-ServerDiagnosticsV248Test-537127248-project-member] Waiting for the task: (returnval){ [ 755.195916] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52456217-b604-cb0b-3cb9-0a8ae4721ebc" [ 755.195916] env[61439]: _type = "Task" [ 755.195916] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 755.208439] env[61439]: DEBUG oslo_vmware.api [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52456217-b604-cb0b-3cb9-0a8ae4721ebc, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 755.420770] env[61439]: DEBUG nova.network.neutron [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Successfully created port: 1e9f8b2d-ce5e-4702-8c66-69e4a3ab9a16 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 755.708534] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 755.708830] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Processing image a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 755.709059] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 755.762306] env[61439]: ERROR nova.compute.manager [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 690df5ae-3e80-4edb-ac87-8f8780fda17c, please check neutron logs for more information. [ 755.762306] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 755.762306] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 755.762306] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 755.762306] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 755.762306] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 755.762306] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 755.762306] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 755.762306] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 755.762306] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 755.762306] env[61439]: ERROR nova.compute.manager File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 755.762306] env[61439]: ERROR nova.compute.manager raise self.value [ 755.762306] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 755.762306] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 755.762306] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 755.762306] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 755.762765] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 755.762765] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 755.762765] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 690df5ae-3e80-4edb-ac87-8f8780fda17c, please check neutron logs for more information. 
[ 755.762765] env[61439]: ERROR nova.compute.manager [ 755.762765] env[61439]: Traceback (most recent call last): [ 755.762765] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 755.762765] env[61439]: listener.cb(fileno) [ 755.762765] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 755.762765] env[61439]: result = function(*args, **kwargs) [ 755.762765] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 755.762765] env[61439]: return func(*args, **kwargs) [ 755.762765] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 755.762765] env[61439]: raise e [ 755.762765] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 755.762765] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 755.762765] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 755.762765] env[61439]: created_port_ids = self._update_ports_for_instance( [ 755.762765] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 755.762765] env[61439]: with excutils.save_and_reraise_exception(): [ 755.762765] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 755.762765] env[61439]: self.force_reraise() [ 755.762765] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 755.762765] env[61439]: raise self.value [ 755.762765] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 755.762765] env[61439]: updated_port = self._update_port( [ 755.762765] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 755.762765] env[61439]: 
_ensure_no_port_binding_failure(port) [ 755.762765] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 755.762765] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 755.763567] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 690df5ae-3e80-4edb-ac87-8f8780fda17c, please check neutron logs for more information. [ 755.763567] env[61439]: Removing descriptor: 23 [ 755.763567] env[61439]: ERROR nova.compute.manager [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 690df5ae-3e80-4edb-ac87-8f8780fda17c, please check neutron logs for more information. [ 755.763567] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Traceback (most recent call last): [ 755.763567] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 755.763567] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] yield resources [ 755.763567] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 755.763567] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] self.driver.spawn(context, instance, image_meta, [ 755.763567] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 755.763567] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 
755.763567] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 755.763567] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] vm_ref = self.build_virtual_machine(instance, [ 755.763866] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 755.763866] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] vif_infos = vmwarevif.get_vif_info(self._session, [ 755.763866] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 755.763866] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] for vif in network_info: [ 755.763866] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 755.763866] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] return self._sync_wrapper(fn, *args, **kwargs) [ 755.763866] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 755.763866] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] self.wait() [ 755.763866] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 755.763866] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] self[:] = self._gt.wait() [ 755.763866] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 755.763866] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] return self._exit_event.wait() [ 755.763866] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 755.764196] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] result = hub.switch() [ 755.764196] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 755.764196] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] return self.greenlet.switch() [ 755.764196] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 755.764196] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] result = function(*args, **kwargs) [ 755.764196] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 755.764196] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] return func(*args, **kwargs) [ 755.764196] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 755.764196] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] raise e [ 755.764196] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in 
_allocate_network_async [ 755.764196] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] nwinfo = self.network_api.allocate_for_instance( [ 755.764196] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 755.764196] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] created_port_ids = self._update_ports_for_instance( [ 755.764488] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 755.764488] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] with excutils.save_and_reraise_exception(): [ 755.764488] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 755.764488] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] self.force_reraise() [ 755.764488] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 755.764488] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] raise self.value [ 755.764488] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 755.764488] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] updated_port = self._update_port( [ 755.764488] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/network/neutron.py", line 585, in 
_update_port [ 755.764488] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] _ensure_no_port_binding_failure(port) [ 755.764488] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 755.764488] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] raise exception.PortBindingFailed(port_id=port['id']) [ 755.764768] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] nova.exception.PortBindingFailed: Binding failed for port 690df5ae-3e80-4edb-ac87-8f8780fda17c, please check neutron logs for more information. [ 755.764768] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] [ 755.764768] env[61439]: INFO nova.compute.manager [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Terminating instance [ 755.767162] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Acquiring lock "refresh_cache-5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 755.767341] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Acquired lock "refresh_cache-5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 755.768473] env[61439]: DEBUG nova.network.neutron [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 
tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 755.905130] env[61439]: DEBUG nova.network.neutron [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 756.856464] env[61439]: DEBUG nova.network.neutron [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 756.872616] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Releasing lock "refresh_cache-5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 756.873051] env[61439]: DEBUG nova.compute.manager [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 756.873519] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 756.874118] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8d935f22-0cfa-4c29-a7cd-361372ac907d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 756.887909] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-976cce67-87c2-4bac-abd2-a17487d0ac68 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 756.923182] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6 could not be found. [ 756.923501] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 756.925092] env[61439]: INFO nova.compute.manager [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Took 0.05 seconds to destroy the instance on the hypervisor. 
[ 756.925092] env[61439]: DEBUG oslo.service.loopingcall [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 756.925338] env[61439]: DEBUG nova.compute.manager [-] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 756.925456] env[61439]: DEBUG nova.network.neutron [-] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 756.995503] env[61439]: ERROR nova.compute.manager [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 6e291bb6-0f87-4960-b168-42bfea1ff96f, please check neutron logs for more information. 
[ 756.995503] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 756.995503] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 756.995503] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 756.995503] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 756.995503] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 756.995503] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 756.995503] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 756.995503] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 756.995503] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 756.995503] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 756.995503] env[61439]: ERROR nova.compute.manager raise self.value [ 756.995503] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 756.995503] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 756.995503] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 756.995503] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 756.996955] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 756.996955] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 756.996955] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 6e291bb6-0f87-4960-b168-42bfea1ff96f, please check neutron logs for more information. [ 756.996955] env[61439]: ERROR nova.compute.manager [ 756.996955] env[61439]: Traceback (most recent call last): [ 756.996955] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 756.996955] env[61439]: listener.cb(fileno) [ 756.996955] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 756.996955] env[61439]: result = function(*args, **kwargs) [ 756.996955] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 756.996955] env[61439]: return func(*args, **kwargs) [ 756.996955] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 756.996955] env[61439]: raise e [ 756.996955] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 756.996955] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 756.996955] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 756.996955] env[61439]: created_port_ids = self._update_ports_for_instance( [ 756.996955] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 756.996955] env[61439]: with excutils.save_and_reraise_exception(): [ 756.996955] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 756.996955] env[61439]: self.force_reraise() [ 756.996955] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 756.996955] env[61439]: raise self.value [ 756.996955] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 756.996955] env[61439]: 
updated_port = self._update_port( [ 756.996955] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 756.996955] env[61439]: _ensure_no_port_binding_failure(port) [ 756.996955] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 756.996955] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 756.997761] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 6e291bb6-0f87-4960-b168-42bfea1ff96f, please check neutron logs for more information. [ 756.997761] env[61439]: Removing descriptor: 22 [ 756.997761] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Acquiring lock "f6fba9ce-77a3-49cd-b52f-30d52e884a5b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 756.997761] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Lock "f6fba9ce-77a3-49cd-b52f-30d52e884a5b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 756.999646] env[61439]: ERROR nova.compute.manager [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 6e291bb6-0f87-4960-b168-42bfea1ff96f, please check neutron logs for more information. 
[ 756.999646] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Traceback (most recent call last): [ 756.999646] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 756.999646] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] yield resources [ 756.999646] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 756.999646] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] self.driver.spawn(context, instance, image_meta, [ 756.999646] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 756.999646] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] self._vmops.spawn(context, instance, image_meta, injected_files, [ 756.999646] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 756.999646] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] vm_ref = self.build_virtual_machine(instance, [ 756.999646] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 757.000193] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] vif_infos = vmwarevif.get_vif_info(self._session, [ 757.000193] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 757.000193] env[61439]: ERROR 
nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] for vif in network_info: [ 757.000193] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 757.000193] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] return self._sync_wrapper(fn, *args, **kwargs) [ 757.000193] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 757.000193] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] self.wait() [ 757.000193] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 757.000193] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] self[:] = self._gt.wait() [ 757.000193] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 757.000193] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] return self._exit_event.wait() [ 757.000193] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 757.000193] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] result = hub.switch() [ 757.000679] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 757.000679] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] return self.greenlet.switch() [ 757.000679] 
env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 757.000679] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] result = function(*args, **kwargs) [ 757.000679] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 757.000679] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] return func(*args, **kwargs) [ 757.000679] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 757.000679] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] raise e [ 757.000679] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 757.000679] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] nwinfo = self.network_api.allocate_for_instance( [ 757.000679] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 757.000679] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] created_port_ids = self._update_ports_for_instance( [ 757.000679] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 757.001094] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] with excutils.save_and_reraise_exception(): [ 757.001094] env[61439]: ERROR nova.compute.manager 
[instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 757.001094] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] self.force_reraise() [ 757.001094] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 757.001094] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] raise self.value [ 757.001094] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 757.001094] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] updated_port = self._update_port( [ 757.001094] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 757.001094] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] _ensure_no_port_binding_failure(port) [ 757.001094] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 757.001094] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] raise exception.PortBindingFailed(port_id=port['id']) [ 757.001094] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] nova.exception.PortBindingFailed: Binding failed for port 6e291bb6-0f87-4960-b168-42bfea1ff96f, please check neutron logs for more information. 
[ 757.001094] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] [ 757.002269] env[61439]: INFO nova.compute.manager [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Terminating instance [ 757.779303] env[61439]: DEBUG nova.network.neutron [-] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 757.783986] env[61439]: DEBUG nova.compute.manager [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 757.785790] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquiring lock "refresh_cache-5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 757.785900] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquired lock "refresh_cache-5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 757.786099] env[61439]: DEBUG nova.network.neutron [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Building 
network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 757.797714] env[61439]: DEBUG nova.network.neutron [-] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 757.810835] env[61439]: INFO nova.compute.manager [-] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Took 0.89 seconds to deallocate network for instance. [ 757.813856] env[61439]: DEBUG nova.compute.claims [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 757.813856] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 757.813856] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 757.856126] env[61439]: DEBUG nova.network.neutron [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 757.871386] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 758.025180] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59f1ebc0-6bca-49e8-b013-086b7b7f328d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 758.032358] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-831c2bdd-5d22-43fd-8c92-11695293db1e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 758.067403] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a47840a8-b492-4443-a481-e22bd0b470b1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 758.078190] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ddeb8ae-0513-49c8-a75e-e08960a1dace {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 758.095284] env[61439]: DEBUG nova.compute.provider_tree [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 758.107711] env[61439]: 
DEBUG nova.scheduler.client.report [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 758.128116] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.314s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 758.128726] env[61439]: ERROR nova.compute.manager [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 690df5ae-3e80-4edb-ac87-8f8780fda17c, please check neutron logs for more information. 
[ 758.128726] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Traceback (most recent call last): [ 758.128726] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 758.128726] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] self.driver.spawn(context, instance, image_meta, [ 758.128726] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 758.128726] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 758.128726] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 758.128726] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] vm_ref = self.build_virtual_machine(instance, [ 758.128726] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 758.128726] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] vif_infos = vmwarevif.get_vif_info(self._session, [ 758.128726] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 758.129048] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] for vif in network_info: [ 758.129048] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 758.129048] env[61439]: ERROR 
nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] return self._sync_wrapper(fn, *args, **kwargs) [ 758.129048] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 758.129048] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] self.wait() [ 758.129048] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 758.129048] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] self[:] = self._gt.wait() [ 758.129048] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 758.129048] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] return self._exit_event.wait() [ 758.129048] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 758.129048] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] result = hub.switch() [ 758.129048] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 758.129048] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] return self.greenlet.switch() [ 758.129419] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 758.129419] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] result 
= function(*args, **kwargs) [ 758.129419] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 758.129419] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] return func(*args, **kwargs) [ 758.129419] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 758.129419] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] raise e [ 758.129419] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 758.129419] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] nwinfo = self.network_api.allocate_for_instance( [ 758.129419] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 758.129419] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] created_port_ids = self._update_ports_for_instance( [ 758.129419] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 758.129419] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] with excutils.save_and_reraise_exception(): [ 758.129419] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 758.129758] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] self.force_reraise() [ 758.129758] env[61439]: 
ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 758.129758] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] raise self.value [ 758.129758] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 758.129758] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] updated_port = self._update_port( [ 758.129758] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 758.129758] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] _ensure_no_port_binding_failure(port) [ 758.129758] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 758.129758] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] raise exception.PortBindingFailed(port_id=port['id']) [ 758.129758] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] nova.exception.PortBindingFailed: Binding failed for port 690df5ae-3e80-4edb-ac87-8f8780fda17c, please check neutron logs for more information. [ 758.129758] env[61439]: ERROR nova.compute.manager [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] [ 758.130457] env[61439]: DEBUG nova.compute.utils [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Binding failed for port 690df5ae-3e80-4edb-ac87-8f8780fda17c, please check neutron logs for more information. 
{{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 758.132489] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.262s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 758.134271] env[61439]: INFO nova.compute.claims [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 758.137861] env[61439]: DEBUG nova.compute.manager [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Build of instance 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6 was re-scheduled: Binding failed for port 690df5ae-3e80-4edb-ac87-8f8780fda17c, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 758.140868] env[61439]: DEBUG nova.compute.manager [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 758.140868] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Acquiring lock "refresh_cache-5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 758.140868] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Acquired lock "refresh_cache-5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 758.140868] env[61439]: DEBUG nova.network.neutron [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 758.235235] env[61439]: DEBUG nova.network.neutron [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 758.385559] env[61439]: DEBUG nova.network.neutron [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 758.390748] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f620106-b842-411c-a4b5-9be236f83c13 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 758.398896] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d86036d0-ce76-4eb6-9540-2cc91149c601 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 758.405078] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Releasing lock "refresh_cache-5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 758.406212] env[61439]: DEBUG nova.compute.manager [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 758.406212] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 758.406840] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9d8bfc93-9a74-45f1-b824-da09ae992d50 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 758.439318] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c8a5a0c-db21-4e43-a419-c8d66cbedf21 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 758.447162] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8fff9459-f0d3-4c77-921a-4d0839395c1d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 758.463233] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41b62b7e-c1e4-421e-a94e-7cb78747c711 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 758.486609] env[61439]: DEBUG nova.compute.provider_tree [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 758.488851] env[61439]: WARNING nova.virt.vmwareapi.vmops [None 
req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de could not be found. [ 758.488851] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 758.488851] env[61439]: INFO nova.compute.manager [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Took 0.08 seconds to destroy the instance on the hypervisor. [ 758.488851] env[61439]: DEBUG oslo.service.loopingcall [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 758.489260] env[61439]: DEBUG nova.compute.manager [-] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 758.489353] env[61439]: DEBUG nova.network.neutron [-] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 758.499697] env[61439]: DEBUG nova.scheduler.client.report [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 758.527135] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.393s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 758.527135] env[61439]: DEBUG nova.compute.manager [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Start 
building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 758.578823] env[61439]: DEBUG nova.network.neutron [-] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 758.593556] env[61439]: DEBUG nova.network.neutron [-] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 758.607293] env[61439]: DEBUG nova.compute.utils [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 758.608789] env[61439]: DEBUG nova.compute.manager [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 758.610037] env[61439]: DEBUG nova.network.neutron [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 758.625462] env[61439]: INFO nova.compute.manager [-] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Took 0.14 seconds to deallocate network for instance. 
[ 758.625462] env[61439]: DEBUG nova.compute.manager [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 758.629735] env[61439]: DEBUG nova.compute.claims [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 758.629912] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 758.630142] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 758.675905] env[61439]: INFO nova.virt.block_device [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Booting with volume d91ce8e0-5482-4bf8-9362-01c6b89272ee at /dev/sda [ 758.735469] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with 
opID=oslo.vmware-310010c6-3bb0-4ffb-bf32-2efb07b1e35b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 758.749873] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba5cd00d-c4fc-4aeb-bed5-44c800d24d0b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 758.784187] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-eaf0318a-78dc-4ff9-adbe-aaaad1c11351 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 758.795066] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddb5b4be-8b4d-45bd-b7e4-61036a9a5f45 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 758.812049] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Acquiring lock "559acc57-5718-41bc-aa69-a8ca3272b28f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 758.814511] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Lock "559acc57-5718-41bc-aa69-a8ca3272b28f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 758.828528] env[61439]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67cc916d-5fbe-4cdd-8655-b2aa89d49d38 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 758.831349] env[61439]: DEBUG nova.compute.manager [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 758.842168] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdf7107a-f491-4977-bfab-55c88ed1b6bd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 758.855728] env[61439]: DEBUG nova.virt.block_device [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Updating existing volume attachment record: c1d9d6e1-6b40-4fc7-bdd5-e222e4772ee4 {{(pid=61439) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} [ 758.904228] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 758.906685] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9529b837-81b1-4213-b755-e9ad644a7b44 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 758.914424] env[61439]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0887222-970e-446d-bd9a-94250a76520e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 758.944869] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fec3581b-144c-4183-926a-0a68638bf9b3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 758.953597] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e526ec42-db6b-40df-bba9-b95775aace9a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 758.967281] env[61439]: DEBUG nova.compute.provider_tree [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 758.981229] env[61439]: DEBUG nova.scheduler.client.report [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 758.995763] env[61439]: DEBUG oslo_concurrency.lockutils [None 
req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.365s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 758.996396] env[61439]: ERROR nova.compute.manager [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 6e291bb6-0f87-4960-b168-42bfea1ff96f, please check neutron logs for more information. [ 758.996396] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Traceback (most recent call last): [ 758.996396] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 758.996396] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] self.driver.spawn(context, instance, image_meta, [ 758.996396] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 758.996396] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] self._vmops.spawn(context, instance, image_meta, injected_files, [ 758.996396] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 758.996396] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] vm_ref = self.build_virtual_machine(instance, [ 758.996396] env[61439]: ERROR nova.compute.manager 
[instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 758.996396] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] vif_infos = vmwarevif.get_vif_info(self._session, [ 758.996396] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 758.996715] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] for vif in network_info: [ 758.996715] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 758.996715] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] return self._sync_wrapper(fn, *args, **kwargs) [ 758.996715] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 758.996715] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] self.wait() [ 758.996715] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 758.996715] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] self[:] = self._gt.wait() [ 758.996715] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 758.996715] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] return self._exit_event.wait() [ 758.996715] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 758.996715] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] result = hub.switch() [ 758.996715] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 758.996715] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] return self.greenlet.switch() [ 758.997043] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 758.997043] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] result = function(*args, **kwargs) [ 758.997043] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 758.997043] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] return func(*args, **kwargs) [ 758.997043] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 758.997043] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] raise e [ 758.997043] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 758.997043] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] nwinfo = self.network_api.allocate_for_instance( [ 758.997043] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in 
allocate_for_instance [ 758.997043] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] created_port_ids = self._update_ports_for_instance( [ 758.997043] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 758.997043] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] with excutils.save_and_reraise_exception(): [ 758.997043] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 758.997349] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] self.force_reraise() [ 758.997349] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 758.997349] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] raise self.value [ 758.997349] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 758.997349] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] updated_port = self._update_port( [ 758.997349] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 758.997349] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] _ensure_no_port_binding_failure(port) [ 758.997349] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] File "/opt/stack/nova/nova/network/neutron.py", line 294, in 
_ensure_no_port_binding_failure [ 758.997349] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] raise exception.PortBindingFailed(port_id=port['id']) [ 758.997349] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] nova.exception.PortBindingFailed: Binding failed for port 6e291bb6-0f87-4960-b168-42bfea1ff96f, please check neutron logs for more information. [ 758.997349] env[61439]: ERROR nova.compute.manager [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] [ 758.997611] env[61439]: DEBUG nova.compute.utils [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Binding failed for port 6e291bb6-0f87-4960-b168-42bfea1ff96f, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 758.998633] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.094s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 758.999661] env[61439]: INFO nova.compute.claims [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 759.003473] env[61439]: DEBUG nova.compute.manager [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Build of 
instance 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de was re-scheduled: Binding failed for port 6e291bb6-0f87-4960-b168-42bfea1ff96f, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 759.003528] env[61439]: DEBUG nova.compute.manager [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 759.003718] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquiring lock "refresh_cache-5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 759.003863] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquired lock "refresh_cache-5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 759.004029] env[61439]: DEBUG nova.network.neutron [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 759.077666] env[61439]: DEBUG nova.network.neutron [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 
5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 759.080779] env[61439]: DEBUG nova.policy [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '696ddf29d56e4b25820d33a408aa7397', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'abaf03622e1f42f485b74180094a7db2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 759.110337] env[61439]: DEBUG nova.compute.manager [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 759.110870] env[61439]: DEBUG nova.virt.hardware [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format=,created_at=,direct_url=,disk_format=,id=,min_disk=0,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 759.111109] env[61439]: DEBUG nova.virt.hardware [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 759.111332] env[61439]: DEBUG nova.virt.hardware [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 759.111451] env[61439]: DEBUG nova.virt.hardware [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 759.111592] env[61439]: DEBUG nova.virt.hardware [None 
req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 759.111740] env[61439]: DEBUG nova.virt.hardware [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 759.111942] env[61439]: DEBUG nova.virt.hardware [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 759.112128] env[61439]: DEBUG nova.virt.hardware [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 759.112392] env[61439]: DEBUG nova.virt.hardware [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 759.112488] env[61439]: DEBUG nova.virt.hardware [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) 
_get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 759.112660] env[61439]: DEBUG nova.virt.hardware [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 759.113791] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-539cb484-0998-46aa-ac08-45401d93a79f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 759.123311] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-968eabb2-dd9d-4858-a786-92d3ef1d6a70 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 759.219069] env[61439]: DEBUG nova.network.neutron [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 759.228540] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2aeb8fb6-3024-4624-a004-1c38ed7975db {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 759.232166] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Releasing lock "refresh_cache-5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6" {{(pid=61439) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 759.232568] env[61439]: DEBUG nova.compute.manager [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 759.232801] env[61439]: DEBUG nova.compute.manager [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 759.232984] env[61439]: DEBUG nova.network.neutron [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 759.239853] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b9112a7-bb2b-44b4-9d28-8a3f197123ed {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 759.271107] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65afded1-48d3-4b5c-ba50-89f858a646ec {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 759.278983] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-247080ee-1a4c-41cc-a55e-fb78f36f3b10 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 759.295456] env[61439]: DEBUG nova.compute.provider_tree [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 759.305070] env[61439]: DEBUG nova.scheduler.client.report [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 759.310587] env[61439]: DEBUG nova.network.neutron [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 759.323138] env[61439]: DEBUG nova.network.neutron [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 759.324653] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.326s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 759.325586] env[61439]: DEBUG nova.compute.manager [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 759.332094] env[61439]: INFO nova.compute.manager [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] [instance: 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6] Took 0.10 seconds to deallocate network for instance. 
[ 759.372884] env[61439]: DEBUG nova.compute.utils [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 759.374612] env[61439]: DEBUG nova.compute.manager [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 759.374693] env[61439]: DEBUG nova.network.neutron [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 759.388188] env[61439]: DEBUG nova.compute.manager [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Start building block device mappings for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 759.457576] env[61439]: INFO nova.scheduler.client.report [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Deleted allocations for instance 5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6 [ 759.494404] env[61439]: DEBUG nova.compute.manager [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 759.501821] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4b15fd02-16e9-4c7d-8ad4-9d16fc7cc909 tempest-SecurityGroupsTestJSON-969051472 tempest-SecurityGroupsTestJSON-969051472-project-member] Lock "5bc95fc5-0d0d-4678-b2e6-1b7ca0e4b3d6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 19.134s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 759.531613] env[61439]: DEBUG nova.virt.hardware [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 759.531857] env[61439]: DEBUG nova.virt.hardware [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 759.532022] env[61439]: DEBUG nova.virt.hardware [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 759.532231] env[61439]: DEBUG nova.virt.hardware [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 759.532455] env[61439]: DEBUG nova.virt.hardware [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 759.532615] env[61439]: DEBUG nova.virt.hardware [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Chose sockets=0, cores=0, threads=0; limits 
were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 759.532973] env[61439]: DEBUG nova.virt.hardware [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 759.533061] env[61439]: DEBUG nova.virt.hardware [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 759.533368] env[61439]: DEBUG nova.virt.hardware [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 759.533606] env[61439]: DEBUG nova.virt.hardware [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 759.534385] env[61439]: DEBUG nova.virt.hardware [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 759.534904] env[61439]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3d41132-72c8-4b0d-95ba-40732c115ff3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 759.545210] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-730ba7cf-aabe-40fe-a66d-91755a04c736 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 759.641263] env[61439]: DEBUG nova.policy [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f74f71de02b944e08a69a23f7fbfb0c3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7fd75a17bdcf4008a04e758009616713', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 759.735364] env[61439]: DEBUG nova.network.neutron [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 759.747616] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Releasing lock "refresh_cache-5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de" {{(pid=61439) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 759.747711] env[61439]: DEBUG nova.compute.manager [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 759.747873] env[61439]: DEBUG nova.compute.manager [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 759.748050] env[61439]: DEBUG nova.network.neutron [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 759.814613] env[61439]: DEBUG nova.network.neutron [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 759.823015] env[61439]: DEBUG nova.network.neutron [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 759.834112] env[61439]: INFO nova.compute.manager [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de] Took 0.09 seconds to deallocate network for instance. [ 759.853221] env[61439]: ERROR nova.compute.manager [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 09684df6-9453-4d3e-be24-48120b3abdab, please check neutron logs for more information. 
[ 759.853221] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 759.853221] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 759.853221] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 759.853221] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 759.853221] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 759.853221] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 759.853221] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 759.853221] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 759.853221] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 759.853221] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 759.853221] env[61439]: ERROR nova.compute.manager raise self.value [ 759.853221] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 759.853221] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 759.853221] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 759.853221] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 759.853732] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 759.853732] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 759.853732] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 09684df6-9453-4d3e-be24-48120b3abdab, please check neutron logs for more information. [ 759.853732] env[61439]: ERROR nova.compute.manager [ 759.853732] env[61439]: Traceback (most recent call last): [ 759.853732] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 759.853732] env[61439]: listener.cb(fileno) [ 759.853732] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 759.853732] env[61439]: result = function(*args, **kwargs) [ 759.853732] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 759.853732] env[61439]: return func(*args, **kwargs) [ 759.853732] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 759.853732] env[61439]: raise e [ 759.853732] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 759.853732] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 759.853732] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 759.853732] env[61439]: created_port_ids = self._update_ports_for_instance( [ 759.853732] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 759.853732] env[61439]: with excutils.save_and_reraise_exception(): [ 759.853732] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 759.853732] env[61439]: self.force_reraise() [ 759.853732] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 759.853732] env[61439]: raise self.value [ 759.853732] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 759.853732] env[61439]: 
updated_port = self._update_port( [ 759.853732] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 759.853732] env[61439]: _ensure_no_port_binding_failure(port) [ 759.853732] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 759.853732] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 759.854576] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 09684df6-9453-4d3e-be24-48120b3abdab, please check neutron logs for more information. [ 759.854576] env[61439]: Removing descriptor: 10 [ 759.854576] env[61439]: ERROR nova.compute.manager [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 09684df6-9453-4d3e-be24-48120b3abdab, please check neutron logs for more information. 
[ 759.854576] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Traceback (most recent call last): [ 759.854576] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 759.854576] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] yield resources [ 759.854576] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 759.854576] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] self.driver.spawn(context, instance, image_meta, [ 759.854576] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 759.854576] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] self._vmops.spawn(context, instance, image_meta, injected_files, [ 759.854576] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 759.854576] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] vm_ref = self.build_virtual_machine(instance, [ 759.855154] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 759.855154] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] vif_infos = vmwarevif.get_vif_info(self._session, [ 759.855154] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 759.855154] env[61439]: ERROR 
nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] for vif in network_info: [ 759.855154] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 759.855154] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] return self._sync_wrapper(fn, *args, **kwargs) [ 759.855154] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 759.855154] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] self.wait() [ 759.855154] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 759.855154] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] self[:] = self._gt.wait() [ 759.855154] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 759.855154] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] return self._exit_event.wait() [ 759.855154] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 759.855612] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] result = hub.switch() [ 759.855612] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 759.855612] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] return self.greenlet.switch() [ 759.855612] 
env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 759.855612] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] result = function(*args, **kwargs) [ 759.855612] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 759.855612] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] return func(*args, **kwargs) [ 759.855612] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 759.855612] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] raise e [ 759.855612] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 759.855612] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] nwinfo = self.network_api.allocate_for_instance( [ 759.855612] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 759.855612] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] created_port_ids = self._update_ports_for_instance( [ 759.856090] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 759.856090] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] with excutils.save_and_reraise_exception(): [ 759.856090] env[61439]: ERROR nova.compute.manager 
[instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 759.856090] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] self.force_reraise() [ 759.856090] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 759.856090] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] raise self.value [ 759.856090] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 759.856090] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] updated_port = self._update_port( [ 759.856090] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 759.856090] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] _ensure_no_port_binding_failure(port) [ 759.856090] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 759.856090] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] raise exception.PortBindingFailed(port_id=port['id']) [ 759.856421] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] nova.exception.PortBindingFailed: Binding failed for port 09684df6-9453-4d3e-be24-48120b3abdab, please check neutron logs for more information. 
[ 759.856421] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] [ 759.856421] env[61439]: INFO nova.compute.manager [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Terminating instance [ 759.857328] env[61439]: DEBUG oslo_concurrency.lockutils [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Acquiring lock "refresh_cache-375d8f80-430a-4856-9b83-33c4aa945a57" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 759.857328] env[61439]: DEBUG oslo_concurrency.lockutils [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Acquired lock "refresh_cache-375d8f80-430a-4856-9b83-33c4aa945a57" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 759.857328] env[61439]: DEBUG nova.network.neutron [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 759.964394] env[61439]: INFO nova.scheduler.client.report [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Deleted allocations for instance 5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de [ 759.991049] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82624f7b-68fd-458c-b5b5-b21356765a18 tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] 
Lock "5188eb5e-7273-4fa9-bfe4-b1ec2bcdb3de" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 17.871s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 760.172081] env[61439]: DEBUG nova.network.neutron [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 760.729958] env[61439]: DEBUG nova.network.neutron [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 760.751620] env[61439]: DEBUG oslo_concurrency.lockutils [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Releasing lock "refresh_cache-375d8f80-430a-4856-9b83-33c4aa945a57" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 760.752119] env[61439]: DEBUG nova.compute.manager [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 760.752345] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 760.753182] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4ae2937a-d526-456f-83ef-9eaddfe81ed3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 760.773332] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea310720-c291-4af0-859a-591747465555 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 760.812965] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 375d8f80-430a-4856-9b83-33c4aa945a57 could not be found. 
[ 760.813556] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 760.813923] env[61439]: INFO nova.compute.manager [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Took 0.06 seconds to destroy the instance on the hypervisor. [ 760.815095] env[61439]: DEBUG oslo.service.loopingcall [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 760.816270] env[61439]: DEBUG nova.compute.manager [-] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 760.816494] env[61439]: DEBUG nova.network.neutron [-] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 760.822574] env[61439]: DEBUG oslo_concurrency.lockutils [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Acquiring lock "5f05e360-888d-41c3-87ec-e4838dc5a2f2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 760.822930] env[61439]: DEBUG oslo_concurrency.lockutils [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Lock "5f05e360-888d-41c3-87ec-e4838dc5a2f2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 760.841059] env[61439]: DEBUG nova.compute.manager [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 760.914836] env[61439]: DEBUG nova.network.neutron [-] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 760.927544] env[61439]: DEBUG nova.network.neutron [-] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 760.937177] env[61439]: DEBUG oslo_concurrency.lockutils [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 760.937540] env[61439]: DEBUG oslo_concurrency.lockutils [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 760.939429] env[61439]: INFO nova.compute.claims [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 760.943219] env[61439]: INFO nova.compute.manager [-] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Took 0.13 seconds to deallocate network for instance. 
[ 760.946525] env[61439]: DEBUG nova.compute.claims [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 760.946525] env[61439]: DEBUG oslo_concurrency.lockutils [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 761.205768] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3ae3067-4fb3-4b38-a798-36d6379040d4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 761.214288] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-024fe3d1-39aa-459c-a27a-92f3e7638a21 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 761.257418] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-779d012e-3b59-4651-9691-03c66b965b34 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 761.266289] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98aa3725-aaa0-4e91-8835-3a5bb4e80d73 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 761.280711] env[61439]: DEBUG nova.compute.provider_tree [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 
tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 761.294063] env[61439]: DEBUG nova.scheduler.client.report [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 761.314350] env[61439]: DEBUG oslo_concurrency.lockutils [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.377s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 761.314866] env[61439]: DEBUG nova.compute.manager [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Start building networks asynchronously for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 761.321260] env[61439]: DEBUG oslo_concurrency.lockutils [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.372s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 761.361067] env[61439]: DEBUG nova.compute.utils [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 761.362959] env[61439]: DEBUG nova.compute.manager [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 761.363154] env[61439]: DEBUG nova.network.neutron [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 761.378562] env[61439]: DEBUG nova.compute.manager [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Start building block device mappings for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 761.487503] env[61439]: DEBUG nova.compute.manager [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 761.526529] env[61439]: DEBUG nova.virt.hardware [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 761.526907] env[61439]: DEBUG nova.virt.hardware [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 761.527189] env[61439]: DEBUG nova.virt.hardware [None req-728a5c1b-d451-40f7-959a-724e37ca2312 
tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 761.527873] env[61439]: DEBUG nova.virt.hardware [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 761.528176] env[61439]: DEBUG nova.virt.hardware [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 761.528368] env[61439]: DEBUG nova.virt.hardware [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 761.529071] env[61439]: DEBUG nova.virt.hardware [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 761.529362] env[61439]: DEBUG nova.virt.hardware [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 761.530225] env[61439]: DEBUG nova.virt.hardware [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 761.534016] env[61439]: DEBUG nova.virt.hardware [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 761.534016] env[61439]: DEBUG nova.virt.hardware [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 761.534201] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d66dc968-8154-4793-b412-0c7333469312 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 761.550951] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53ffc03e-498f-4c62-b2d6-0aa2b2974a56 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 761.582980] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebbb2fa8-9d45-40d1-839e-e9bc3662af44 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 761.590774] env[61439]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c53e2fa7-a893-4be4-a6b0-e258504f0e16 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 761.624137] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16065eb1-8fda-40e6-8c48-7ec02a469abf {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 761.631917] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-117a8b65-b628-48ce-bace-5381bd7ac1b2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 761.646628] env[61439]: DEBUG nova.compute.provider_tree [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 761.661234] env[61439]: DEBUG nova.scheduler.client.report [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 761.667604] env[61439]: DEBUG nova.policy [None req-728a5c1b-d451-40f7-959a-724e37ca2312 
tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0ba2092a98474cd4bbd66180b95d3329', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1f9b592f20764458954202cad4dcc1d1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 761.685248] env[61439]: DEBUG oslo_concurrency.lockutils [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.368s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 761.685870] env[61439]: ERROR nova.compute.manager [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 09684df6-9453-4d3e-be24-48120b3abdab, please check neutron logs for more information. 
[ 761.685870] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Traceback (most recent call last): [ 761.685870] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 761.685870] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] self.driver.spawn(context, instance, image_meta, [ 761.685870] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 761.685870] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] self._vmops.spawn(context, instance, image_meta, injected_files, [ 761.685870] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 761.685870] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] vm_ref = self.build_virtual_machine(instance, [ 761.685870] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 761.685870] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] vif_infos = vmwarevif.get_vif_info(self._session, [ 761.685870] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 761.686210] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] for vif in network_info: [ 761.686210] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 761.686210] env[61439]: ERROR 
nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] return self._sync_wrapper(fn, *args, **kwargs) [ 761.686210] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 761.686210] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] self.wait() [ 761.686210] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 761.686210] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] self[:] = self._gt.wait() [ 761.686210] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 761.686210] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] return self._exit_event.wait() [ 761.686210] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 761.686210] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] result = hub.switch() [ 761.686210] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 761.686210] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] return self.greenlet.switch() [ 761.686510] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 761.686510] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] result 
= function(*args, **kwargs) [ 761.686510] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 761.686510] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] return func(*args, **kwargs) [ 761.686510] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 761.686510] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] raise e [ 761.686510] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 761.686510] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] nwinfo = self.network_api.allocate_for_instance( [ 761.686510] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 761.686510] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] created_port_ids = self._update_ports_for_instance( [ 761.686510] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 761.686510] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] with excutils.save_and_reraise_exception(): [ 761.686510] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 761.686814] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] self.force_reraise() [ 761.686814] env[61439]: 
ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 761.686814] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] raise self.value [ 761.686814] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 761.686814] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] updated_port = self._update_port( [ 761.686814] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 761.686814] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] _ensure_no_port_binding_failure(port) [ 761.686814] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 761.686814] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] raise exception.PortBindingFailed(port_id=port['id']) [ 761.686814] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] nova.exception.PortBindingFailed: Binding failed for port 09684df6-9453-4d3e-be24-48120b3abdab, please check neutron logs for more information. 
[ 761.686814] env[61439]: ERROR nova.compute.manager [instance: 375d8f80-430a-4856-9b83-33c4aa945a57]
[ 761.687211] env[61439]: DEBUG nova.compute.utils [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Binding failed for port 09684df6-9453-4d3e-be24-48120b3abdab, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 761.688477] env[61439]: DEBUG nova.compute.manager [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Build of instance 375d8f80-430a-4856-9b83-33c4aa945a57 was re-scheduled: Binding failed for port 09684df6-9453-4d3e-be24-48120b3abdab, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 761.688889] env[61439]: DEBUG nova.compute.manager [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 761.689134] env[61439]: DEBUG oslo_concurrency.lockutils [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Acquiring lock "refresh_cache-375d8f80-430a-4856-9b83-33c4aa945a57" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 761.689281] env[61439]: DEBUG oslo_concurrency.lockutils [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Acquired lock "refresh_cache-375d8f80-430a-4856-9b83-33c4aa945a57" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 761.689437] env[61439]: DEBUG nova.network.neutron [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 761.788088] env[61439]: DEBUG nova.network.neutron [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 761.815441] env[61439]: DEBUG nova.network.neutron [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Successfully created port: abf3dd49-7566-41b8-9aec-33cee02d4bca {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 762.107121] env[61439]: DEBUG nova.network.neutron [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Successfully created port: e094db18-1c04-4a5c-b1b3-977f24a0d11f {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 762.692200] env[61439]: DEBUG nova.network.neutron [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 762.709742] env[61439]: DEBUG oslo_concurrency.lockutils [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Releasing lock "refresh_cache-375d8f80-430a-4856-9b83-33c4aa945a57" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 762.709742] env[61439]: DEBUG nova.compute.manager [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 762.709742] env[61439]: DEBUG nova.compute.manager [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 762.709851] env[61439]: DEBUG nova.network.neutron [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 762.798283] env[61439]: DEBUG nova.network.neutron [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 762.809302] env[61439]: DEBUG nova.network.neutron [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 762.822404] env[61439]: INFO nova.compute.manager [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] [instance: 375d8f80-430a-4856-9b83-33c4aa945a57] Took 0.11 seconds to deallocate network for instance.
[ 762.952581] env[61439]: INFO nova.scheduler.client.report [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Deleted allocations for instance 375d8f80-430a-4856-9b83-33c4aa945a57
[ 762.983254] env[61439]: DEBUG oslo_concurrency.lockutils [None req-90f653f5-0e51-46a4-9290-ad527dce5eed tempest-AttachInterfacesTestJSON-210578950 tempest-AttachInterfacesTestJSON-210578950-project-member] Lock "375d8f80-430a-4856-9b83-33c4aa945a57" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 17.090s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 763.236037] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Acquiring lock "45a24553-3e31-4187-81fe-e9d1c8ca9353" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 763.236037] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Lock "45a24553-3e31-4187-81fe-e9d1c8ca9353" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 763.245441] env[61439]: DEBUG nova.compute.manager [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 763.315779] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 763.316375] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 763.318868] env[61439]: INFO nova.compute.claims [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 763.393843] env[61439]: DEBUG oslo_concurrency.lockutils [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Acquiring lock "0c167874-85c7-45bf-b296-971cdae1fe6a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 763.395577] env[61439]: DEBUG oslo_concurrency.lockutils [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Lock "0c167874-85c7-45bf-b296-971cdae1fe6a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 763.409565] env[61439]: DEBUG nova.compute.manager [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 763.494055] env[61439]: DEBUG oslo_concurrency.lockutils [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 763.598020] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e47abbc-b649-4900-a946-79387c834f64 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 763.610022] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91f7a991-e074-44e4-9f53-41161e0d339c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 763.653455] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-309f74f4-82ae-424e-bcda-895ac5b06cea {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 763.659580] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cabdba4-bce2-40e4-99e0-e4bb5f10d45c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 763.675567] env[61439]: DEBUG nova.compute.provider_tree [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 763.692947] env[61439]: DEBUG nova.scheduler.client.report [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 763.733546] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.417s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 763.733546] env[61439]: DEBUG nova.compute.manager [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 763.737717] env[61439]: DEBUG oslo_concurrency.lockutils [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.243s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 763.740070] env[61439]: INFO nova.compute.claims [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 763.802103] env[61439]: DEBUG nova.compute.utils [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 763.802103] env[61439]: DEBUG nova.compute.manager [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 763.802103] env[61439]: DEBUG nova.network.neutron [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 763.813954] env[61439]: DEBUG nova.compute.manager [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 763.904498] env[61439]: DEBUG nova.compute.manager [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 763.938033] env[61439]: DEBUG nova.virt.hardware [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=<?>,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-09-20T17:02:48Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 763.938234] env[61439]: DEBUG nova.virt.hardware [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 763.938397] env[61439]: DEBUG nova.virt.hardware [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 763.938584] env[61439]: DEBUG nova.virt.hardware [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 763.938730] env[61439]: DEBUG nova.virt.hardware [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 763.939145] env[61439]: DEBUG nova.virt.hardware [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 763.939418] env[61439]: DEBUG nova.virt.hardware [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 763.939728] env[61439]: DEBUG nova.virt.hardware [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 763.939793] env[61439]: DEBUG nova.virt.hardware [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 763.939906] env[61439]: DEBUG nova.virt.hardware [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 763.940090] env[61439]: DEBUG nova.virt.hardware [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 763.941689] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb06f593-61bf-4baa-9a28-e5715bc5c325 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 763.952073] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-464749db-5b7b-458d-a152-343cf17ee96b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 764.036476] env[61439]: DEBUG nova.network.neutron [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Successfully created port: aa7b8e70-ec98-4d58-a0c7-7c7ea7927b18 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 764.040649] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcf71685-2951-43d7-8de1-dd7d64ac5929 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 764.048835] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f46819a-9f0c-4870-9464-3f74cca299f3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 764.092169] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74664396-e423-4c00-b4f0-efd21f622c59 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 764.101608] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57d756fd-bc42-46e4-9f51-c377709b60c4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 764.117544] env[61439]: DEBUG nova.compute.provider_tree [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 764.130848] env[61439]: DEBUG nova.scheduler.client.report [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 764.141143] env[61439]: DEBUG nova.policy [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c468b82bd9d64e19b419a393fff4af06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f86a86563cc047459d3e7c0553c82c63', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}}
[ 764.145105] env[61439]: DEBUG oslo_concurrency.lockutils [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.408s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 764.145594] env[61439]: DEBUG nova.compute.manager [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 764.191489] env[61439]: DEBUG nova.compute.utils [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 764.193097] env[61439]: DEBUG nova.compute.manager [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 764.193596] env[61439]: DEBUG nova.network.neutron [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 764.207085] env[61439]: DEBUG nova.compute.manager [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 764.292024] env[61439]: DEBUG nova.compute.manager [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 764.323717] env[61439]: DEBUG nova.virt.hardware [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=<?>,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-09-20T17:02:48Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 764.324871] env[61439]: DEBUG nova.virt.hardware [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 764.324871] env[61439]: DEBUG nova.virt.hardware [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 764.324871] env[61439]: DEBUG nova.virt.hardware [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 764.324871] env[61439]: DEBUG nova.virt.hardware [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 764.324871] env[61439]: DEBUG nova.virt.hardware [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 764.325087] env[61439]: DEBUG nova.virt.hardware [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 764.325087] env[61439]: DEBUG nova.virt.hardware [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 764.326568] env[61439]: DEBUG nova.virt.hardware [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 764.326568] env[61439]: DEBUG nova.virt.hardware [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 764.326568] env[61439]: DEBUG nova.virt.hardware [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 764.326568] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82e95b9f-4567-43d6-83f7-090889868a3c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 764.338159] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99af522a-e416-448f-97ce-2fcc57cc5ca5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 764.424336] env[61439]: ERROR nova.compute.manager [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 1e9f8b2d-ce5e-4702-8c66-69e4a3ab9a16, please check neutron logs for more information.
[ 764.424336] env[61439]: ERROR nova.compute.manager Traceback (most recent call last):
[ 764.424336] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 764.424336] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 764.424336] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 764.424336] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 764.424336] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 764.424336] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 764.424336] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 764.424336] env[61439]: ERROR nova.compute.manager self.force_reraise()
[ 764.424336] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 764.424336] env[61439]: ERROR nova.compute.manager raise self.value
[ 764.424336] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 764.424336] env[61439]: ERROR nova.compute.manager updated_port = self._update_port(
[ 764.424336] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 764.424336] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 764.424841] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 764.424841] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 764.424841] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 1e9f8b2d-ce5e-4702-8c66-69e4a3ab9a16, please check neutron logs for more information.
[ 764.424841] env[61439]: ERROR nova.compute.manager
[ 764.424841] env[61439]: Traceback (most recent call last):
[ 764.424841] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait
[ 764.424841] env[61439]: listener.cb(fileno)
[ 764.424841] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 764.424841] env[61439]: result = function(*args, **kwargs)
[ 764.424841] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 764.424841] env[61439]: return func(*args, **kwargs)
[ 764.424841] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 764.424841] env[61439]: raise e
[ 764.424841] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 764.424841] env[61439]: nwinfo = self.network_api.allocate_for_instance(
[ 764.424841] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 764.424841] env[61439]: created_port_ids = self._update_ports_for_instance(
[ 764.424841] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 764.424841] env[61439]: with excutils.save_and_reraise_exception():
[ 764.424841] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 764.424841] env[61439]: self.force_reraise()
[ 764.424841] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 764.424841] env[61439]: raise self.value
[ 764.424841] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 764.424841] env[61439]: updated_port = self._update_port(
[ 764.424841] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 764.424841] env[61439]: _ensure_no_port_binding_failure(port)
[ 764.424841] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 764.424841] env[61439]: raise exception.PortBindingFailed(port_id=port['id'])
[ 764.427514] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 1e9f8b2d-ce5e-4702-8c66-69e4a3ab9a16, please check neutron logs for more information.
[ 764.427514] env[61439]: Removing descriptor: 20
[ 764.427514] env[61439]: ERROR nova.compute.manager [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 1e9f8b2d-ce5e-4702-8c66-69e4a3ab9a16, please check neutron logs for more information.
[ 764.427514] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Traceback (most recent call last): [ 764.427514] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 764.427514] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] yield resources [ 764.427514] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 764.427514] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] self.driver.spawn(context, instance, image_meta, [ 764.427514] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 764.427514] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] self._vmops.spawn(context, instance, image_meta, injected_files, [ 764.427514] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 764.427514] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] vm_ref = self.build_virtual_machine(instance, [ 764.427803] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 764.427803] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] vif_infos = vmwarevif.get_vif_info(self._session, [ 764.427803] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 764.427803] env[61439]: ERROR 
nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] for vif in network_info: [ 764.427803] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 764.427803] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] return self._sync_wrapper(fn, *args, **kwargs) [ 764.427803] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 764.427803] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] self.wait() [ 764.427803] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 764.427803] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] self[:] = self._gt.wait() [ 764.427803] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 764.427803] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] return self._exit_event.wait() [ 764.427803] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 764.428125] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] result = hub.switch() [ 764.428125] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 764.428125] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] return self.greenlet.switch() [ 764.428125] 
env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 764.428125] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] result = function(*args, **kwargs) [ 764.428125] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 764.428125] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] return func(*args, **kwargs) [ 764.428125] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 764.428125] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] raise e [ 764.428125] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 764.428125] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] nwinfo = self.network_api.allocate_for_instance( [ 764.428125] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 764.428125] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] created_port_ids = self._update_ports_for_instance( [ 764.428460] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 764.428460] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] with excutils.save_and_reraise_exception(): [ 764.428460] env[61439]: ERROR nova.compute.manager 
[instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 764.428460] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] self.force_reraise() [ 764.428460] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 764.428460] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] raise self.value [ 764.428460] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 764.428460] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] updated_port = self._update_port( [ 764.428460] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 764.428460] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] _ensure_no_port_binding_failure(port) [ 764.428460] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 764.428460] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] raise exception.PortBindingFailed(port_id=port['id']) [ 764.428738] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] nova.exception.PortBindingFailed: Binding failed for port 1e9f8b2d-ce5e-4702-8c66-69e4a3ab9a16, please check neutron logs for more information. 
[ 764.428738] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] [ 764.428738] env[61439]: INFO nova.compute.manager [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Terminating instance [ 764.428738] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "refresh_cache-09300ec8-8a0e-4447-a32e-dd232e74fc53" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 764.428738] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquired lock "refresh_cache-09300ec8-8a0e-4447-a32e-dd232e74fc53" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 764.431061] env[61439]: DEBUG nova.network.neutron [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 764.495273] env[61439]: DEBUG nova.policy [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '165cc88b32b74a0b9c4e62301b61da68', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0665c37087ef4f0abae09ed7fe33efae', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 
'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 764.757721] env[61439]: DEBUG nova.network.neutron [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 765.402120] env[61439]: DEBUG nova.network.neutron [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 765.414361] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Releasing lock "refresh_cache-09300ec8-8a0e-4447-a32e-dd232e74fc53" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 765.414833] env[61439]: DEBUG nova.compute.manager [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 765.415129] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 765.415584] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-23f3d1b8-7cad-4bc0-b4df-93792cd19ce4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 765.428148] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff834fd7-5281-4a64-a58d-4c7f51c47b9d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 765.460671] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 09300ec8-8a0e-4447-a32e-dd232e74fc53 could not be found. [ 765.460905] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 765.461179] env[61439]: INFO nova.compute.manager [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Took 0.05 seconds to destroy the instance on the hypervisor. 
[ 765.461407] env[61439]: DEBUG oslo.service.loopingcall [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 765.461629] env[61439]: DEBUG nova.compute.manager [-] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 765.461701] env[61439]: DEBUG nova.network.neutron [-] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 765.561494] env[61439]: DEBUG nova.network.neutron [-] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 765.575435] env[61439]: DEBUG nova.network.neutron [-] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 765.588282] env[61439]: INFO nova.compute.manager [-] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Took 0.13 seconds to deallocate network for instance. 
[ 765.596027] env[61439]: DEBUG nova.compute.claims [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 765.596027] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 765.596027] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 765.848029] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fff0e852-5e91-414a-b57e-6cfc21efed4d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 765.856614] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c50a55c-fe5e-428e-bfc2-cca467020841 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 765.890196] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e83b2344-c061-49c6-9141-f49bd097b29a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 765.898205] 
env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0dfc3cdb-f0a4-4154-b17b-d33f652f15ce {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 765.912890] env[61439]: DEBUG nova.compute.provider_tree [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 765.927258] env[61439]: DEBUG nova.scheduler.client.report [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 765.952878] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.359s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 765.953595] env[61439]: ERROR nova.compute.manager [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 
tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 1e9f8b2d-ce5e-4702-8c66-69e4a3ab9a16, please check neutron logs for more information. [ 765.953595] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Traceback (most recent call last): [ 765.953595] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 765.953595] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] self.driver.spawn(context, instance, image_meta, [ 765.953595] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 765.953595] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] self._vmops.spawn(context, instance, image_meta, injected_files, [ 765.953595] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 765.953595] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] vm_ref = self.build_virtual_machine(instance, [ 765.953595] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 765.953595] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] vif_infos = vmwarevif.get_vif_info(self._session, [ 765.953595] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 765.954134] env[61439]: ERROR nova.compute.manager 
[instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] for vif in network_info: [ 765.954134] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 765.954134] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] return self._sync_wrapper(fn, *args, **kwargs) [ 765.954134] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 765.954134] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] self.wait() [ 765.954134] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 765.954134] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] self[:] = self._gt.wait() [ 765.954134] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 765.954134] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] return self._exit_event.wait() [ 765.954134] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 765.954134] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] result = hub.switch() [ 765.954134] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 765.954134] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] return self.greenlet.switch() [ 765.954792] env[61439]: ERROR 
nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 765.954792] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] result = function(*args, **kwargs) [ 765.954792] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 765.954792] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] return func(*args, **kwargs) [ 765.954792] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 765.954792] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] raise e [ 765.954792] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 765.954792] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] nwinfo = self.network_api.allocate_for_instance( [ 765.954792] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 765.954792] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] created_port_ids = self._update_ports_for_instance( [ 765.954792] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 765.954792] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] with excutils.save_and_reraise_exception(): [ 765.954792] env[61439]: ERROR nova.compute.manager [instance: 
09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 765.955681] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] self.force_reraise() [ 765.955681] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 765.955681] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] raise self.value [ 765.955681] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 765.955681] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] updated_port = self._update_port( [ 765.955681] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 765.955681] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] _ensure_no_port_binding_failure(port) [ 765.955681] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 765.955681] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] raise exception.PortBindingFailed(port_id=port['id']) [ 765.955681] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] nova.exception.PortBindingFailed: Binding failed for port 1e9f8b2d-ce5e-4702-8c66-69e4a3ab9a16, please check neutron logs for more information. 
[ 765.955681] env[61439]: ERROR nova.compute.manager [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53]
[ 765.956477] env[61439]: DEBUG nova.compute.utils [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Binding failed for port 1e9f8b2d-ce5e-4702-8c66-69e4a3ab9a16, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 765.956477] env[61439]: DEBUG nova.compute.manager [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Build of instance 09300ec8-8a0e-4447-a32e-dd232e74fc53 was re-scheduled: Binding failed for port 1e9f8b2d-ce5e-4702-8c66-69e4a3ab9a16, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 765.956477] env[61439]: DEBUG nova.compute.manager [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 765.956866] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "refresh_cache-09300ec8-8a0e-4447-a32e-dd232e74fc53" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 765.956866] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquired lock "refresh_cache-09300ec8-8a0e-4447-a32e-dd232e74fc53" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 765.957249] env[61439]: DEBUG nova.network.neutron [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 766.034926] env[61439]: DEBUG nova.network.neutron [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 766.529425] env[61439]: DEBUG nova.network.neutron [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Successfully created port: be1fb9f7-258b-4aca-b899-f1818b72b1f4 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 766.792838] env[61439]: DEBUG nova.network.neutron [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 766.807267] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Releasing lock "refresh_cache-09300ec8-8a0e-4447-a32e-dd232e74fc53" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 766.807267] env[61439]: DEBUG nova.compute.manager [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 766.807267] env[61439]: DEBUG nova.compute.manager [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 766.807267] env[61439]: DEBUG nova.network.neutron [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 767.103163] env[61439]: DEBUG nova.network.neutron [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 767.119798] env[61439]: DEBUG nova.network.neutron [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Successfully created port: d855f0dd-5b33-4e2d-8570-9f285221250a {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 767.124367] env[61439]: DEBUG nova.network.neutron [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 767.143371] env[61439]: INFO nova.compute.manager [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 09300ec8-8a0e-4447-a32e-dd232e74fc53] Took 0.34 seconds to deallocate network for instance.
[ 767.289195] env[61439]: INFO nova.scheduler.client.report [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Deleted allocations for instance 09300ec8-8a0e-4447-a32e-dd232e74fc53
[ 767.311592] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ae0b0106-6902-4dcd-acf8-d9157e0ec902 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "09300ec8-8a0e-4447-a32e-dd232e74fc53" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.486s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 769.202537] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 769.202806] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Starting heal instance info cache {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 769.202843] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Rebuilding the list of instances to heal {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 769.226357] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 769.226554] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 4527b287-d099-443c-a424-185d02054be0] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 769.226689] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 769.226815] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 769.227467] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 769.227628] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 769.227761] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 769.227888] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 769.228292] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 769.228449] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Didn't find any instances for network info cache update. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 769.229232] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 770.201551] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 770.201803] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager.update_available_resource {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 770.218011] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 770.218272] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 770.218307] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 770.218458] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=61439) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 770.219722] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-750dcbe4-31af-4351-abcb-f32c6234df7e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 770.231561] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f170886c-b855-4156-aa56-ea594d9eee80 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 770.248258] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc2fbad6-a77a-42cb-bc15-46624ce4ff63 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 770.257557] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0815199-232f-440e-adee-192cc6667f65 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 770.297508] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181571MB free_disk=35GB free_vcpus=48 pci_devices=None {{(pid=61439) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 770.297679] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 770.297931] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 770.388514] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 8fe2bccd-5b46-4067-b72b-bdbf726c0155 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 770.389397] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 4527b287-d099-443c-a424-185d02054be0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 770.389397] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 42ca8a89-5938-491b-b122-deac71d18505 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 770.389397] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance bf9101c9-4072-4f72-8ac3-24b7a5b88b45 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 770.389549] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance f6fba9ce-77a3-49cd-b52f-30d52e884a5b actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 770.389670] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 559acc57-5718-41bc-aa69-a8ca3272b28f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 770.390078] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 5f05e360-888d-41c3-87ec-e4838dc5a2f2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 770.390226] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 45a24553-3e31-4187-81fe-e9d1c8ca9353 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 770.390435] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 0c167874-85c7-45bf-b296-971cdae1fe6a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 770.390811] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 770.391019] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 770.547478] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c47a6b6-28ef-4659-b020-73fd31a71c24 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 770.556412] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4eb2a524-fbc8-48e0-936b-cc5f85f93d1a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 770.593404] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f0888d9-4662-454c-b855-d291159b725b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 770.601348] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69b075bc-0f5a-46b7-9864-68c69e3926c8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 770.615732] env[61439]: DEBUG nova.compute.provider_tree [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 770.625809] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 770.642291] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=61439) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 770.642414] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.344s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 770.977903] env[61439]: ERROR nova.compute.manager [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port abf3dd49-7566-41b8-9aec-33cee02d4bca, please check neutron logs for more information.
[ 770.977903] env[61439]: ERROR nova.compute.manager Traceback (most recent call last):
[ 770.977903] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 770.977903] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 770.977903] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 770.977903] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 770.977903] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 770.977903] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 770.977903] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 770.977903] env[61439]: ERROR nova.compute.manager self.force_reraise()
[ 770.977903] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 770.977903] env[61439]: ERROR nova.compute.manager raise self.value
[ 770.977903] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 770.977903] env[61439]: ERROR nova.compute.manager updated_port = self._update_port(
[ 770.977903] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 770.977903] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 770.978425] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 770.978425] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 770.978425] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port abf3dd49-7566-41b8-9aec-33cee02d4bca, please check neutron logs for more information.
[ 770.978425] env[61439]: ERROR nova.compute.manager
[ 770.978425] env[61439]: Traceback (most recent call last):
[ 770.978425] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait
[ 770.978425] env[61439]: listener.cb(fileno)
[ 770.978425] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 770.978425] env[61439]: result = function(*args, **kwargs)
[ 770.978425] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 770.978425] env[61439]: return func(*args, **kwargs)
[ 770.978425] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 770.978425] env[61439]: raise e
[ 770.978425] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 770.978425] env[61439]: nwinfo = self.network_api.allocate_for_instance(
[ 770.978425] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 770.978425] env[61439]: created_port_ids = self._update_ports_for_instance(
[ 770.978425] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 770.978425] env[61439]: with excutils.save_and_reraise_exception():
[ 770.978425] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 770.978425] env[61439]: self.force_reraise()
[ 770.978425] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 770.978425] env[61439]: raise self.value
[ 770.978425] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 770.978425] env[61439]: updated_port = self._update_port(
[ 770.978425] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 770.978425] env[61439]: _ensure_no_port_binding_failure(port)
[ 770.978425] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 770.978425] env[61439]: raise exception.PortBindingFailed(port_id=port['id'])
[ 770.979543] env[61439]: nova.exception.PortBindingFailed: Binding failed for port abf3dd49-7566-41b8-9aec-33cee02d4bca, please check neutron logs for more information.
[ 770.979543] env[61439]: Removing descriptor: 23
[ 770.979543] env[61439]: ERROR nova.compute.manager [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port abf3dd49-7566-41b8-9aec-33cee02d4bca, please check neutron logs for more information.
[ 770.979543] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Traceback (most recent call last):
[ 770.979543] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 770.979543] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] yield resources
[ 770.979543] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 770.979543] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] self.driver.spawn(context, instance, image_meta,
[ 770.979543] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 770.979543] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 770.979543] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 770.979543] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] vm_ref = self.build_virtual_machine(instance,
[ 770.980281] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 770.980281] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] vif_infos = vmwarevif.get_vif_info(self._session,
[ 770.980281] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 770.980281] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] for vif in network_info:
[ 770.980281] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 770.980281] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] return self._sync_wrapper(fn, *args, **kwargs)
[ 770.980281] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 770.980281] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] self.wait()
[ 770.980281] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 770.980281] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] self[:] = self._gt.wait()
[ 770.980281] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 770.980281] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] return self._exit_event.wait()
[ 770.980281] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 770.981161] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] result = hub.switch()
[ 770.981161] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 770.981161] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] return self.greenlet.switch()
[ 770.981161] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 770.981161] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] result = function(*args, **kwargs)
[ 770.981161] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 770.981161] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] return func(*args, **kwargs)
[ 770.981161] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 770.981161] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] raise e
[ 770.981161] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 770.981161] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] nwinfo = self.network_api.allocate_for_instance(
[ 770.981161] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 770.981161] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] created_port_ids = self._update_ports_for_instance(
[ 770.981887] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 770.981887] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] with excutils.save_and_reraise_exception():
[ 770.981887] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 770.981887] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] self.force_reraise()
[ 770.981887] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 770.981887] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] raise self.value
[ 770.981887] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 770.981887] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] updated_port = self._update_port(
[ 770.981887] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 770.981887] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] _ensure_no_port_binding_failure(port)
[ 770.981887] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 770.981887] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] raise exception.PortBindingFailed(port_id=port['id'])
[ 770.983237] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] nova.exception.PortBindingFailed: Binding failed for port abf3dd49-7566-41b8-9aec-33cee02d4bca, please check neutron logs for more information.
[ 770.983237] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b]
[ 770.983237] env[61439]: INFO nova.compute.manager [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Terminating instance
[ 770.983237] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Acquiring lock "refresh_cache-f6fba9ce-77a3-49cd-b52f-30d52e884a5b" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 770.983237] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Acquired lock "refresh_cache-f6fba9ce-77a3-49cd-b52f-30d52e884a5b" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 770.983237] env[61439]: DEBUG nova.network.neutron [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 771.063541] env[61439]: DEBUG nova.network.neutron [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 771.409212] env[61439]: ERROR nova.compute.manager [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port e094db18-1c04-4a5c-b1b3-977f24a0d11f, please check neutron logs for more information.
[ 771.409212] env[61439]: ERROR nova.compute.manager Traceback (most recent call last):
[ 771.409212] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 771.409212] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 771.409212] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 771.409212] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 771.409212] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 771.409212] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 771.409212] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 771.409212] env[61439]: ERROR nova.compute.manager self.force_reraise()
[ 771.409212] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 771.409212] env[61439]: ERROR nova.compute.manager raise self.value
[ 771.409212] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 771.409212] env[61439]: ERROR nova.compute.manager updated_port = self._update_port(
[ 771.409212] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 771.409212] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 771.410106] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 771.410106] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 771.410106] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port e094db18-1c04-4a5c-b1b3-977f24a0d11f, please check neutron logs for more information.
[ 771.410106] env[61439]: ERROR nova.compute.manager
[ 771.410106] env[61439]: Traceback (most recent call last):
[ 771.410106] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait
[ 771.410106] env[61439]: listener.cb(fileno)
[ 771.410106] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 771.410106] env[61439]: result = function(*args, **kwargs)
[ 771.410106] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 771.410106] env[61439]: return func(*args, **kwargs)
[ 771.410106] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 771.410106] env[61439]: raise e
[ 771.410106] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 771.410106] env[61439]: nwinfo = self.network_api.allocate_for_instance(
[ 771.410106] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 771.410106] env[61439]: created_port_ids = self._update_ports_for_instance(
[ 771.410106] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 771.410106] env[61439]: with excutils.save_and_reraise_exception():
[ 771.410106]
env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 771.410106] env[61439]: self.force_reraise() [ 771.410106] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 771.410106] env[61439]: raise self.value [ 771.410106] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 771.410106] env[61439]: updated_port = self._update_port( [ 771.410106] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 771.410106] env[61439]: _ensure_no_port_binding_failure(port) [ 771.410106] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 771.410106] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 771.410785] env[61439]: nova.exception.PortBindingFailed: Binding failed for port e094db18-1c04-4a5c-b1b3-977f24a0d11f, please check neutron logs for more information. [ 771.410785] env[61439]: Removing descriptor: 22 [ 771.410785] env[61439]: ERROR nova.compute.manager [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port e094db18-1c04-4a5c-b1b3-977f24a0d11f, please check neutron logs for more information. 
[ 771.410785] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Traceback (most recent call last): [ 771.410785] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 771.410785] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] yield resources [ 771.410785] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 771.410785] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] self.driver.spawn(context, instance, image_meta, [ 771.410785] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 771.410785] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 771.410785] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 771.410785] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] vm_ref = self.build_virtual_machine(instance, [ 771.411113] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 771.411113] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] vif_infos = vmwarevif.get_vif_info(self._session, [ 771.411113] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 771.411113] env[61439]: ERROR 
nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] for vif in network_info: [ 771.411113] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 771.411113] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] return self._sync_wrapper(fn, *args, **kwargs) [ 771.411113] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 771.411113] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] self.wait() [ 771.411113] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 771.411113] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] self[:] = self._gt.wait() [ 771.411113] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 771.411113] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] return self._exit_event.wait() [ 771.411113] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 771.411455] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] result = hub.switch() [ 771.411455] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 771.411455] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] return self.greenlet.switch() [ 771.411455] 
env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 771.411455] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] result = function(*args, **kwargs) [ 771.411455] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 771.411455] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] return func(*args, **kwargs) [ 771.411455] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 771.411455] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] raise e [ 771.411455] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 771.411455] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] nwinfo = self.network_api.allocate_for_instance( [ 771.411455] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 771.411455] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] created_port_ids = self._update_ports_for_instance( [ 771.411795] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 771.411795] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] with excutils.save_and_reraise_exception(): [ 771.411795] env[61439]: ERROR nova.compute.manager 
[instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 771.411795] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] self.force_reraise() [ 771.411795] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 771.411795] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] raise self.value [ 771.411795] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 771.411795] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] updated_port = self._update_port( [ 771.411795] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 771.411795] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] _ensure_no_port_binding_failure(port) [ 771.411795] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 771.411795] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] raise exception.PortBindingFailed(port_id=port['id']) [ 771.412405] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] nova.exception.PortBindingFailed: Binding failed for port e094db18-1c04-4a5c-b1b3-977f24a0d11f, please check neutron logs for more information. 
[ 771.412405] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] [ 771.412405] env[61439]: INFO nova.compute.manager [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Terminating instance [ 771.413072] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Acquiring lock "refresh_cache-559acc57-5718-41bc-aa69-a8ca3272b28f" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 771.413072] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Acquired lock "refresh_cache-559acc57-5718-41bc-aa69-a8ca3272b28f" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 771.413163] env[61439]: DEBUG nova.network.neutron [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 771.537762] env[61439]: DEBUG nova.network.neutron [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 771.638393] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 771.638892] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 771.811024] env[61439]: DEBUG nova.network.neutron [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 771.834652] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Releasing lock "refresh_cache-f6fba9ce-77a3-49cd-b52f-30d52e884a5b" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 771.835241] env[61439]: DEBUG nova.compute.manager [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 771.835803] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3d9c3fd6-2a44-4e7f-ae86-8d44c86a85b5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 771.848638] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1302ede5-2215-4889-812d-1475785490d9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 771.880155] env[61439]: WARNING nova.virt.vmwareapi.driver [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Instance does not exists. Proceeding to delete instance properties on datastore: nova.exception.InstanceNotFound: Instance f6fba9ce-77a3-49cd-b52f-30d52e884a5b could not be found. 
[ 771.880427] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 771.880791] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3f04ab34-8645-4585-aedf-47b47848b8a2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 771.890594] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cb089ee-3794-4a63-a727-0258504c9a4a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 771.919091] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f6fba9ce-77a3-49cd-b52f-30d52e884a5b could not be found. [ 771.919091] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 771.919091] env[61439]: INFO nova.compute.manager [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Took 0.08 seconds to destroy the instance on the hypervisor. 
[ 771.919091] env[61439]: DEBUG oslo.service.loopingcall [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 771.919299] env[61439]: DEBUG nova.compute.manager [-] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 771.919299] env[61439]: DEBUG nova.network.neutron [-] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 771.983444] env[61439]: DEBUG nova.network.neutron [-] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 772.007065] env[61439]: DEBUG nova.network.neutron [-] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 772.022441] env[61439]: INFO nova.compute.manager [-] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Took 0.10 seconds to deallocate network for instance. [ 772.112050] env[61439]: INFO nova.compute.manager [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Took 0.09 seconds to detach 1 volumes for instance. 
[ 772.114544] env[61439]: DEBUG nova.compute.claims [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 772.114727] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 772.116086] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 772.147303] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquiring lock "8e57e090-2330-49f5-b939-be657884e506" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 772.148145] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "8e57e090-2330-49f5-b939-be657884e506" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 772.170433] env[61439]: DEBUG nova.compute.manager [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 772.201911] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 772.201911] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=61439) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 772.247312] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 772.377539] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbf01b39-fe6d-49d6-adba-f5cfb185d3c3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.386592] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25cc1bd4-1605-45dc-94a8-bd181aa5c1b3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.419157] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-786c8dfb-7a03-4d09-84a4-aefe26b1cc1f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.429112] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7586f436-fd17-4ac1-9647-1d924c3f1b35 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.444134] env[61439]: DEBUG nova.compute.provider_tree [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 772.463597] 
env[61439]: DEBUG nova.scheduler.client.report [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 772.489357] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.374s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 772.489997] env[61439]: ERROR nova.compute.manager [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port abf3dd49-7566-41b8-9aec-33cee02d4bca, please check neutron logs for more information. 
[ 772.489997] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Traceback (most recent call last): [ 772.489997] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 772.489997] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] self.driver.spawn(context, instance, image_meta, [ 772.489997] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 772.489997] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 772.489997] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 772.489997] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] vm_ref = self.build_virtual_machine(instance, [ 772.489997] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 772.489997] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] vif_infos = vmwarevif.get_vif_info(self._session, [ 772.489997] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 772.490387] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] for vif in network_info: [ 772.490387] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 772.490387] env[61439]: ERROR 
nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] return self._sync_wrapper(fn, *args, **kwargs) [ 772.490387] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 772.490387] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] self.wait() [ 772.490387] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 772.490387] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] self[:] = self._gt.wait() [ 772.490387] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 772.490387] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] return self._exit_event.wait() [ 772.490387] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 772.490387] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] result = hub.switch() [ 772.490387] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 772.490387] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] return self.greenlet.switch() [ 772.490771] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 772.490771] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] result 
= function(*args, **kwargs) [ 772.490771] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 772.490771] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] return func(*args, **kwargs) [ 772.490771] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 772.490771] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] raise e [ 772.490771] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 772.490771] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] nwinfo = self.network_api.allocate_for_instance( [ 772.490771] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 772.490771] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] created_port_ids = self._update_ports_for_instance( [ 772.490771] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 772.490771] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] with excutils.save_and_reraise_exception(): [ 772.490771] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 772.491212] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] self.force_reraise() [ 772.491212] env[61439]: 
ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 772.491212] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] raise self.value [ 772.491212] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 772.491212] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] updated_port = self._update_port( [ 772.491212] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 772.491212] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] _ensure_no_port_binding_failure(port) [ 772.491212] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 772.491212] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] raise exception.PortBindingFailed(port_id=port['id']) [ 772.491212] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] nova.exception.PortBindingFailed: Binding failed for port abf3dd49-7566-41b8-9aec-33cee02d4bca, please check neutron logs for more information. 
[ 772.491212] env[61439]: ERROR nova.compute.manager [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] [ 772.492215] env[61439]: DEBUG nova.compute.utils [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Binding failed for port abf3dd49-7566-41b8-9aec-33cee02d4bca, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 772.494103] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.247s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 772.495966] env[61439]: INFO nova.compute.claims [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 772.502167] env[61439]: DEBUG nova.compute.manager [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Build of instance f6fba9ce-77a3-49cd-b52f-30d52e884a5b was re-scheduled: Binding failed for port abf3dd49-7566-41b8-9aec-33cee02d4bca, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 772.502952] env[61439]: DEBUG nova.compute.manager [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 772.502952] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Acquiring lock "refresh_cache-f6fba9ce-77a3-49cd-b52f-30d52e884a5b" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 772.502952] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Acquired lock "refresh_cache-f6fba9ce-77a3-49cd-b52f-30d52e884a5b" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 772.503267] env[61439]: DEBUG nova.network.neutron [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 772.542223] env[61439]: DEBUG nova.network.neutron [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info 
/opt/stack/nova/nova/network/neutron.py:116}} [ 772.556230] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Releasing lock "refresh_cache-559acc57-5718-41bc-aa69-a8ca3272b28f" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 772.556230] env[61439]: DEBUG nova.compute.manager [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 772.556432] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 772.556960] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-af37fe71-7da9-489a-b7c5-b618df08250f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.573536] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf5401b4-6cc4-4f8d-a4c5-0ee49a6430f8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.610175] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Instance does not exist on backend: 
nova.exception.InstanceNotFound: Instance 559acc57-5718-41bc-aa69-a8ca3272b28f could not be found. [ 772.610175] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 772.610175] env[61439]: INFO nova.compute.manager [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Took 0.05 seconds to destroy the instance on the hypervisor. [ 772.610175] env[61439]: DEBUG oslo.service.loopingcall [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 772.610390] env[61439]: DEBUG nova.network.neutron [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 772.612028] env[61439]: DEBUG nova.compute.manager [-] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 772.612028] env[61439]: DEBUG nova.network.neutron [-] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 772.648231] env[61439]: DEBUG nova.network.neutron [-] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 772.660426] env[61439]: DEBUG nova.network.neutron [-] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 772.676812] env[61439]: INFO nova.compute.manager [-] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Took 0.06 seconds to deallocate network for instance. 
[ 772.678723] env[61439]: DEBUG nova.compute.claims [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 772.678959] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 772.777922] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69089b57-3c87-4393-9357-e6682bbfa6e5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.787634] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ae9d1fc-6766-4ccf-8819-b3fb4876510d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.823092] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c8de1ba-eec4-4731-ab99-b8dac9e8618a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.831508] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92514cc4-86a4-4ab1-93a6-873a89d15b84 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.847118] env[61439]: DEBUG nova.compute.provider_tree [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 
tempest-ServerDiskConfigTestJSON-1897992284-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 772.857145] env[61439]: DEBUG nova.scheduler.client.report [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 772.873596] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.379s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 772.874119] env[61439]: DEBUG nova.compute.manager [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] Start building networks asynchronously for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 772.880399] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.199s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 772.926818] env[61439]: DEBUG nova.compute.utils [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 772.931138] env[61439]: DEBUG nova.compute.manager [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 772.931138] env[61439]: DEBUG nova.network.neutron [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 772.941206] env[61439]: DEBUG nova.compute.manager [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] Start building block device mappings for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 773.056552] env[61439]: DEBUG nova.compute.manager [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 773.088102] env[61439]: ERROR nova.compute.manager [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port aa7b8e70-ec98-4d58-a0c7-7c7ea7927b18, please check neutron logs for more information. [ 773.088102] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 773.088102] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 773.088102] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 773.088102] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 773.088102] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 773.088102] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 773.088102] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 773.088102] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 773.088102] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 773.088102] env[61439]: ERROR nova.compute.manager File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 773.088102] env[61439]: ERROR nova.compute.manager raise self.value [ 773.088102] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 773.088102] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 773.088102] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 773.088102] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 773.088644] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 773.088644] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 773.088644] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port aa7b8e70-ec98-4d58-a0c7-7c7ea7927b18, please check neutron logs for more information. 
[ 773.088644] env[61439]: ERROR nova.compute.manager [ 773.088644] env[61439]: Traceback (most recent call last): [ 773.088644] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 773.088644] env[61439]: listener.cb(fileno) [ 773.088644] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 773.088644] env[61439]: result = function(*args, **kwargs) [ 773.088644] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 773.088644] env[61439]: return func(*args, **kwargs) [ 773.088644] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 773.088644] env[61439]: raise e [ 773.088644] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 773.088644] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 773.088644] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 773.088644] env[61439]: created_port_ids = self._update_ports_for_instance( [ 773.088644] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 773.088644] env[61439]: with excutils.save_and_reraise_exception(): [ 773.088644] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 773.088644] env[61439]: self.force_reraise() [ 773.088644] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 773.088644] env[61439]: raise self.value [ 773.088644] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 773.088644] env[61439]: updated_port = self._update_port( [ 773.088644] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 773.088644] env[61439]: 
_ensure_no_port_binding_failure(port) [ 773.088644] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 773.088644] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 773.089365] env[61439]: nova.exception.PortBindingFailed: Binding failed for port aa7b8e70-ec98-4d58-a0c7-7c7ea7927b18, please check neutron logs for more information. [ 773.089365] env[61439]: Removing descriptor: 10 [ 773.089365] env[61439]: ERROR nova.compute.manager [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port aa7b8e70-ec98-4d58-a0c7-7c7ea7927b18, please check neutron logs for more information. [ 773.089365] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Traceback (most recent call last): [ 773.089365] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 773.089365] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] yield resources [ 773.089365] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 773.089365] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] self.driver.spawn(context, instance, image_meta, [ 773.089365] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 773.089365] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] self._vmops.spawn(context, instance, image_meta, 
injected_files, [ 773.089365] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 773.089365] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] vm_ref = self.build_virtual_machine(instance, [ 773.089662] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 773.089662] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] vif_infos = vmwarevif.get_vif_info(self._session, [ 773.089662] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 773.089662] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] for vif in network_info: [ 773.089662] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 773.089662] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] return self._sync_wrapper(fn, *args, **kwargs) [ 773.089662] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 773.089662] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] self.wait() [ 773.089662] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 773.089662] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] self[:] = self._gt.wait() [ 773.089662] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 773.089662] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] return self._exit_event.wait() [ 773.089662] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 773.090048] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] result = hub.switch() [ 773.090048] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 773.090048] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] return self.greenlet.switch() [ 773.090048] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 773.090048] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] result = function(*args, **kwargs) [ 773.090048] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 773.090048] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] return func(*args, **kwargs) [ 773.090048] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 773.090048] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] raise e [ 773.090048] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in 
_allocate_network_async [ 773.090048] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] nwinfo = self.network_api.allocate_for_instance( [ 773.090048] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 773.090048] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] created_port_ids = self._update_ports_for_instance( [ 773.090368] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 773.090368] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] with excutils.save_and_reraise_exception(): [ 773.090368] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 773.090368] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] self.force_reraise() [ 773.090368] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 773.090368] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] raise self.value [ 773.090368] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 773.090368] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] updated_port = self._update_port( [ 773.090368] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/network/neutron.py", line 585, in 
_update_port [ 773.090368] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] _ensure_no_port_binding_failure(port) [ 773.090368] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 773.090368] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] raise exception.PortBindingFailed(port_id=port['id']) [ 773.090725] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] nova.exception.PortBindingFailed: Binding failed for port aa7b8e70-ec98-4d58-a0c7-7c7ea7927b18, please check neutron logs for more information. [ 773.090725] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] [ 773.090725] env[61439]: INFO nova.compute.manager [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Terminating instance [ 773.091371] env[61439]: DEBUG oslo_concurrency.lockutils [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Acquiring lock "refresh_cache-5f05e360-888d-41c3-87ec-e4838dc5a2f2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 773.092327] env[61439]: DEBUG oslo_concurrency.lockutils [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Acquired lock "refresh_cache-5f05e360-888d-41c3-87ec-e4838dc5a2f2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 773.092327] env[61439]: DEBUG nova.network.neutron [None 
req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 773.101159] env[61439]: DEBUG nova.virt.hardware [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 773.101159] env[61439]: DEBUG nova.virt.hardware [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 773.101159] env[61439]: DEBUG nova.virt.hardware [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:352}} [ 773.101294] env[61439]: DEBUG nova.virt.hardware [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 773.101294] env[61439]: DEBUG nova.virt.hardware [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 773.101294] env[61439]: DEBUG nova.virt.hardware [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 773.101294] env[61439]: DEBUG nova.virt.hardware [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 773.101294] env[61439]: DEBUG nova.virt.hardware [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 773.103638] env[61439]: DEBUG nova.virt.hardware [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Got 1 possible 
topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 773.103638] env[61439]: DEBUG nova.virt.hardware [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 773.103917] env[61439]: DEBUG nova.virt.hardware [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 773.104912] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c14048e2-b8f0-48b7-a3be-863cc837c125 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 773.121672] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42d90217-3a3f-48f8-99de-354d043ae5a2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 773.163835] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b265c61-877d-489b-bcab-46a78cc47481 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 773.172557] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32eda016-f7a3-407d-95e2-56840cb934fa {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 773.178676] env[61439]: DEBUG nova.network.neutron [None 
req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 773.184187] env[61439]: DEBUG nova.policy [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b861ada4972f4431b0b9bd46ae21f7cc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '16074166244d449b99488fc24f4f3d74', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 773.218316] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 773.220795] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 773.221533] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e845fabf-e937-4ab9-bf54-8232cd7d0029 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 773.236234] env[61439]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38b8b1e9-40f7-4217-9dc5-5a88ebef7fbb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 773.251294] env[61439]: DEBUG nova.compute.provider_tree [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 773.266258] env[61439]: DEBUG nova.scheduler.client.report [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 773.294074] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.416s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 773.294837] env[61439]: ERROR nova.compute.manager [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] 
[instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port e094db18-1c04-4a5c-b1b3-977f24a0d11f, please check neutron logs for more information. [ 773.294837] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Traceback (most recent call last): [ 773.294837] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 773.294837] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] self.driver.spawn(context, instance, image_meta, [ 773.294837] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 773.294837] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 773.294837] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 773.294837] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] vm_ref = self.build_virtual_machine(instance, [ 773.294837] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 773.294837] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] vif_infos = vmwarevif.get_vif_info(self._session, [ 773.294837] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 773.295205] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] for vif 
in network_info: [ 773.295205] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 773.295205] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] return self._sync_wrapper(fn, *args, **kwargs) [ 773.295205] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 773.295205] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] self.wait() [ 773.295205] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 773.295205] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] self[:] = self._gt.wait() [ 773.295205] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 773.295205] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] return self._exit_event.wait() [ 773.295205] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 773.295205] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] result = hub.switch() [ 773.295205] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 773.295205] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] return self.greenlet.switch() [ 773.295563] env[61439]: ERROR nova.compute.manager [instance: 
559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 773.295563] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] result = function(*args, **kwargs) [ 773.295563] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 773.295563] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] return func(*args, **kwargs) [ 773.295563] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 773.295563] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] raise e [ 773.295563] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 773.295563] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] nwinfo = self.network_api.allocate_for_instance( [ 773.295563] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 773.295563] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] created_port_ids = self._update_ports_for_instance( [ 773.295563] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 773.295563] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] with excutils.save_and_reraise_exception(): [ 773.295563] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 773.295872] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] self.force_reraise() [ 773.295872] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 773.295872] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] raise self.value [ 773.295872] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 773.295872] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] updated_port = self._update_port( [ 773.295872] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 773.295872] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] _ensure_no_port_binding_failure(port) [ 773.295872] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 773.295872] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] raise exception.PortBindingFailed(port_id=port['id']) [ 773.295872] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] nova.exception.PortBindingFailed: Binding failed for port e094db18-1c04-4a5c-b1b3-977f24a0d11f, please check neutron logs for more information. 
[ 773.295872] env[61439]: ERROR nova.compute.manager [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] [ 773.296209] env[61439]: DEBUG nova.compute.utils [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Binding failed for port e094db18-1c04-4a5c-b1b3-977f24a0d11f, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 773.297546] env[61439]: DEBUG nova.compute.manager [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Build of instance 559acc57-5718-41bc-aa69-a8ca3272b28f was re-scheduled: Binding failed for port e094db18-1c04-4a5c-b1b3-977f24a0d11f, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 773.300259] env[61439]: DEBUG nova.compute.manager [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 773.300259] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Acquiring lock "refresh_cache-559acc57-5718-41bc-aa69-a8ca3272b28f" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 773.300259] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 
tempest-InstanceActionsTestJSON-1957432545-project-member] Acquired lock "refresh_cache-559acc57-5718-41bc-aa69-a8ca3272b28f" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 773.300259] env[61439]: DEBUG nova.network.neutron [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 773.435056] env[61439]: DEBUG nova.network.neutron [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 773.444230] env[61439]: DEBUG nova.network.neutron [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 773.459393] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Releasing lock "refresh_cache-f6fba9ce-77a3-49cd-b52f-30d52e884a5b" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 773.459668] env[61439]: DEBUG nova.compute.manager [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Virt driver does not 
provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 773.459844] env[61439]: DEBUG nova.compute.manager [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 773.460114] env[61439]: DEBUG nova.network.neutron [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 773.555235] env[61439]: DEBUG nova.network.neutron [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 773.566956] env[61439]: DEBUG nova.network.neutron [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 773.579232] env[61439]: INFO nova.compute.manager [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] [instance: f6fba9ce-77a3-49cd-b52f-30d52e884a5b] Took 0.12 seconds to deallocate network for instance. [ 773.696014] env[61439]: INFO nova.scheduler.client.report [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Deleted allocations for instance f6fba9ce-77a3-49cd-b52f-30d52e884a5b [ 773.722996] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f90f22bf-7498-4ad2-8bdc-513b3d956a88 tempest-ServersTestBootFromVolume-1411109147 tempest-ServersTestBootFromVolume-1411109147-project-member] Lock "f6fba9ce-77a3-49cd-b52f-30d52e884a5b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 16.726s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 773.837063] env[61439]: DEBUG nova.network.neutron [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 773.852221] 
env[61439]: DEBUG oslo_concurrency.lockutils [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Releasing lock "refresh_cache-5f05e360-888d-41c3-87ec-e4838dc5a2f2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 773.852668] env[61439]: DEBUG nova.compute.manager [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 773.852855] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 773.853938] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d232641d-bd1b-49d7-9ec8-a2ae7977e974 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 773.865128] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9db5bb8-85e5-44ab-bd07-d788591fa051 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 773.892887] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Instance does not exist on backend: 
nova.exception.InstanceNotFound: Instance 5f05e360-888d-41c3-87ec-e4838dc5a2f2 could not be found. [ 773.893203] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 773.893420] env[61439]: INFO nova.compute.manager [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Took 0.04 seconds to destroy the instance on the hypervisor. [ 773.893690] env[61439]: DEBUG oslo.service.loopingcall [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 773.894237] env[61439]: DEBUG nova.compute.manager [-] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 773.894340] env[61439]: DEBUG nova.network.neutron [-] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 773.932538] env[61439]: DEBUG nova.network.neutron [-] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 773.942659] env[61439]: DEBUG nova.network.neutron [-] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 773.952110] env[61439]: INFO nova.compute.manager [-] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Took 0.06 seconds to deallocate network for instance. [ 773.954332] env[61439]: DEBUG nova.compute.claims [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 773.954484] env[61439]: DEBUG oslo_concurrency.lockutils [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 773.954916] env[61439]: DEBUG oslo_concurrency.lockutils [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 774.162923] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae3df896-3a67-4961-8d80-0b7b8567165a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} 
[ 774.172544] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb989ba0-c057-4a6c-8854-0e9dd720c07a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 774.210806] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf1e0159-d089-4f74-abc1-8d97b60a91ba {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 774.221214] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fdc53785-0e03-476d-9c08-9c4ec5bd8359 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 774.235354] env[61439]: DEBUG nova.compute.provider_tree [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 774.245587] env[61439]: DEBUG nova.scheduler.client.report [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 774.268998] 
env[61439]: DEBUG oslo_concurrency.lockutils [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.314s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 774.269656] env[61439]: ERROR nova.compute.manager [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port aa7b8e70-ec98-4d58-a0c7-7c7ea7927b18, please check neutron logs for more information. [ 774.269656] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Traceback (most recent call last): [ 774.269656] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 774.269656] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] self.driver.spawn(context, instance, image_meta, [ 774.269656] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 774.269656] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 774.269656] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 774.269656] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] vm_ref = 
self.build_virtual_machine(instance, [ 774.269656] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 774.269656] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] vif_infos = vmwarevif.get_vif_info(self._session, [ 774.269656] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 774.270175] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] for vif in network_info: [ 774.270175] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 774.270175] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] return self._sync_wrapper(fn, *args, **kwargs) [ 774.270175] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 774.270175] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] self.wait() [ 774.270175] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 774.270175] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] self[:] = self._gt.wait() [ 774.270175] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 774.270175] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] return self._exit_event.wait() [ 774.270175] env[61439]: ERROR nova.compute.manager [instance: 
5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 774.270175] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] result = hub.switch() [ 774.270175] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 774.270175] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] return self.greenlet.switch() [ 774.270840] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 774.270840] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] result = function(*args, **kwargs) [ 774.270840] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 774.270840] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] return func(*args, **kwargs) [ 774.270840] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 774.270840] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] raise e [ 774.270840] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 774.270840] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] nwinfo = self.network_api.allocate_for_instance( [ 774.270840] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File 
"/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 774.270840] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] created_port_ids = self._update_ports_for_instance( [ 774.270840] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 774.270840] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] with excutils.save_and_reraise_exception(): [ 774.270840] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 774.271528] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] self.force_reraise() [ 774.271528] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 774.271528] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] raise self.value [ 774.271528] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 774.271528] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] updated_port = self._update_port( [ 774.271528] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 774.271528] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] _ensure_no_port_binding_failure(port) [ 774.271528] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] File 
"/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 774.271528] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] raise exception.PortBindingFailed(port_id=port['id']) [ 774.271528] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] nova.exception.PortBindingFailed: Binding failed for port aa7b8e70-ec98-4d58-a0c7-7c7ea7927b18, please check neutron logs for more information. [ 774.271528] env[61439]: ERROR nova.compute.manager [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] [ 774.272010] env[61439]: DEBUG nova.compute.utils [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Binding failed for port aa7b8e70-ec98-4d58-a0c7-7c7ea7927b18, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 774.272334] env[61439]: DEBUG nova.compute.manager [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Build of instance 5f05e360-888d-41c3-87ec-e4838dc5a2f2 was re-scheduled: Binding failed for port aa7b8e70-ec98-4d58-a0c7-7c7ea7927b18, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 774.272779] env[61439]: DEBUG nova.compute.manager [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 774.273275] env[61439]: DEBUG oslo_concurrency.lockutils [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Acquiring lock "refresh_cache-5f05e360-888d-41c3-87ec-e4838dc5a2f2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 774.273275] env[61439]: DEBUG oslo_concurrency.lockutils [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Acquired lock "refresh_cache-5f05e360-888d-41c3-87ec-e4838dc5a2f2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 774.273477] env[61439]: DEBUG nova.network.neutron [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 774.296185] env[61439]: DEBUG nova.network.neutron [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info 
/opt/stack/nova/nova/network/neutron.py:116}} [ 774.313054] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Releasing lock "refresh_cache-559acc57-5718-41bc-aa69-a8ca3272b28f" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 774.313054] env[61439]: DEBUG nova.compute.manager [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 774.313054] env[61439]: DEBUG nova.compute.manager [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 774.313054] env[61439]: DEBUG nova.network.neutron [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 774.377769] env[61439]: DEBUG nova.network.neutron [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 774.388690] env[61439]: DEBUG nova.network.neutron [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 774.394198] env[61439]: DEBUG nova.network.neutron [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 774.403782] env[61439]: INFO nova.compute.manager [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] [instance: 559acc57-5718-41bc-aa69-a8ca3272b28f] Took 0.09 seconds to deallocate network for instance. 
[ 774.554076] env[61439]: INFO nova.scheduler.client.report [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Deleted allocations for instance 559acc57-5718-41bc-aa69-a8ca3272b28f [ 774.583829] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c95bb8b6-21a7-446c-b23b-79e28e377fe9 tempest-InstanceActionsTestJSON-1957432545 tempest-InstanceActionsTestJSON-1957432545-project-member] Lock "559acc57-5718-41bc-aa69-a8ca3272b28f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.771s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 774.760112] env[61439]: ERROR nova.compute.manager [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port be1fb9f7-258b-4aca-b899-f1818b72b1f4, please check neutron logs for more information. 
[ 774.760112] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 774.760112] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 774.760112] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 774.760112] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 774.760112] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 774.760112] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 774.760112] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 774.760112] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 774.760112] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 774.760112] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 774.760112] env[61439]: ERROR nova.compute.manager raise self.value [ 774.760112] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 774.760112] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 774.760112] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 774.760112] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 774.760589] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 774.760589] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 774.760589] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port be1fb9f7-258b-4aca-b899-f1818b72b1f4, please check neutron logs for more information. [ 774.760589] env[61439]: ERROR nova.compute.manager [ 774.760589] env[61439]: Traceback (most recent call last): [ 774.760589] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 774.760589] env[61439]: listener.cb(fileno) [ 774.760589] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 774.760589] env[61439]: result = function(*args, **kwargs) [ 774.760589] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 774.760589] env[61439]: return func(*args, **kwargs) [ 774.760589] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 774.760589] env[61439]: raise e [ 774.760589] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 774.760589] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 774.760589] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 774.760589] env[61439]: created_port_ids = self._update_ports_for_instance( [ 774.760589] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 774.760589] env[61439]: with excutils.save_and_reraise_exception(): [ 774.760589] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 774.760589] env[61439]: self.force_reraise() [ 774.760589] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 774.760589] env[61439]: raise self.value [ 774.760589] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 774.760589] env[61439]: 
updated_port = self._update_port( [ 774.760589] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 774.760589] env[61439]: _ensure_no_port_binding_failure(port) [ 774.760589] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 774.760589] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 774.761385] env[61439]: nova.exception.PortBindingFailed: Binding failed for port be1fb9f7-258b-4aca-b899-f1818b72b1f4, please check neutron logs for more information. [ 774.761385] env[61439]: Removing descriptor: 24 [ 774.761385] env[61439]: ERROR nova.compute.manager [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port be1fb9f7-258b-4aca-b899-f1818b72b1f4, please check neutron logs for more information. 
[ 774.761385] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Traceback (most recent call last): [ 774.761385] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 774.761385] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] yield resources [ 774.761385] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 774.761385] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] self.driver.spawn(context, instance, image_meta, [ 774.761385] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 774.761385] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] self._vmops.spawn(context, instance, image_meta, injected_files, [ 774.761385] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 774.761385] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] vm_ref = self.build_virtual_machine(instance, [ 774.761716] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 774.761716] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] vif_infos = vmwarevif.get_vif_info(self._session, [ 774.761716] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 774.761716] env[61439]: ERROR 
nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] for vif in network_info: [ 774.761716] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 774.761716] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] return self._sync_wrapper(fn, *args, **kwargs) [ 774.761716] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 774.761716] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] self.wait() [ 774.761716] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 774.761716] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] self[:] = self._gt.wait() [ 774.761716] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 774.761716] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] return self._exit_event.wait() [ 774.761716] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 774.762077] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] result = hub.switch() [ 774.762077] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 774.762077] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] return self.greenlet.switch() [ 774.762077] 
env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 774.762077] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] result = function(*args, **kwargs) [ 774.762077] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 774.762077] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] return func(*args, **kwargs) [ 774.762077] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 774.762077] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] raise e [ 774.762077] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 774.762077] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] nwinfo = self.network_api.allocate_for_instance( [ 774.762077] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 774.762077] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] created_port_ids = self._update_ports_for_instance( [ 774.762696] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 774.762696] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] with excutils.save_and_reraise_exception(): [ 774.762696] env[61439]: ERROR nova.compute.manager 
[instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 774.762696] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] self.force_reraise() [ 774.762696] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 774.762696] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] raise self.value [ 774.762696] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 774.762696] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] updated_port = self._update_port( [ 774.762696] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 774.762696] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] _ensure_no_port_binding_failure(port) [ 774.762696] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 774.762696] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] raise exception.PortBindingFailed(port_id=port['id']) [ 774.763071] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] nova.exception.PortBindingFailed: Binding failed for port be1fb9f7-258b-4aca-b899-f1818b72b1f4, please check neutron logs for more information. 
[ 774.763071] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] [ 774.763071] env[61439]: INFO nova.compute.manager [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Terminating instance [ 774.765170] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Acquiring lock "refresh_cache-45a24553-3e31-4187-81fe-e9d1c8ca9353" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 774.765343] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Acquired lock "refresh_cache-45a24553-3e31-4187-81fe-e9d1c8ca9353" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 774.765523] env[61439]: DEBUG nova.network.neutron [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 774.857710] env[61439]: DEBUG nova.network.neutron [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 775.124081] env[61439]: DEBUG nova.network.neutron [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 775.141861] env[61439]: DEBUG oslo_concurrency.lockutils [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Releasing lock "refresh_cache-5f05e360-888d-41c3-87ec-e4838dc5a2f2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 775.142143] env[61439]: DEBUG nova.compute.manager [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 775.142565] env[61439]: DEBUG nova.compute.manager [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 775.142765] env[61439]: DEBUG nova.network.neutron [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 775.208697] env[61439]: DEBUG nova.network.neutron [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 775.217387] env[61439]: DEBUG nova.network.neutron [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 775.228410] env[61439]: INFO nova.compute.manager [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] [instance: 5f05e360-888d-41c3-87ec-e4838dc5a2f2] Took 0.08 seconds to deallocate network for instance. 
[ 775.277164] env[61439]: DEBUG nova.network.neutron [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] Successfully created port: b9d0b31b-5bae-4e87-b3ee-21e3791aedd3 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 775.364784] env[61439]: INFO nova.scheduler.client.report [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Deleted allocations for instance 5f05e360-888d-41c3-87ec-e4838dc5a2f2 [ 775.376056] env[61439]: ERROR nova.compute.manager [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port d855f0dd-5b33-4e2d-8570-9f285221250a, please check neutron logs for more information. 
[ 775.376056] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 775.376056] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 775.376056] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 775.376056] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 775.376056] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 775.376056] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 775.376056] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 775.376056] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 775.376056] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 775.376056] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 775.376056] env[61439]: ERROR nova.compute.manager raise self.value [ 775.376056] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 775.376056] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 775.376056] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 775.376056] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 775.376592] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 775.376592] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 775.376592] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port d855f0dd-5b33-4e2d-8570-9f285221250a, please check neutron logs for more information. [ 775.376592] env[61439]: ERROR nova.compute.manager [ 775.376592] env[61439]: Traceback (most recent call last): [ 775.376592] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 775.376592] env[61439]: listener.cb(fileno) [ 775.376592] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 775.376592] env[61439]: result = function(*args, **kwargs) [ 775.376592] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 775.376592] env[61439]: return func(*args, **kwargs) [ 775.376592] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 775.376592] env[61439]: raise e [ 775.376592] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 775.376592] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 775.376592] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 775.376592] env[61439]: created_port_ids = self._update_ports_for_instance( [ 775.376592] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 775.376592] env[61439]: with excutils.save_and_reraise_exception(): [ 775.376592] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 775.376592] env[61439]: self.force_reraise() [ 775.376592] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 775.376592] env[61439]: raise self.value [ 775.376592] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 775.376592] env[61439]: 
updated_port = self._update_port( [ 775.376592] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 775.376592] env[61439]: _ensure_no_port_binding_failure(port) [ 775.376592] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 775.376592] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 775.377299] env[61439]: nova.exception.PortBindingFailed: Binding failed for port d855f0dd-5b33-4e2d-8570-9f285221250a, please check neutron logs for more information. [ 775.377299] env[61439]: Removing descriptor: 21 [ 775.377299] env[61439]: ERROR nova.compute.manager [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port d855f0dd-5b33-4e2d-8570-9f285221250a, please check neutron logs for more information. 
[ 775.377299] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Traceback (most recent call last): [ 775.377299] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 775.377299] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] yield resources [ 775.377299] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 775.377299] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] self.driver.spawn(context, instance, image_meta, [ 775.377299] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 775.377299] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 775.377299] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 775.377299] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] vm_ref = self.build_virtual_machine(instance, [ 775.377656] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 775.377656] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] vif_infos = vmwarevif.get_vif_info(self._session, [ 775.377656] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 775.377656] env[61439]: ERROR 
nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] for vif in network_info: [ 775.377656] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 775.377656] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] return self._sync_wrapper(fn, *args, **kwargs) [ 775.377656] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 775.377656] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] self.wait() [ 775.377656] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 775.377656] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] self[:] = self._gt.wait() [ 775.377656] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 775.377656] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] return self._exit_event.wait() [ 775.377656] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 775.377980] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] result = hub.switch() [ 775.377980] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 775.377980] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] return self.greenlet.switch() [ 775.377980] 
env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 775.377980] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] result = function(*args, **kwargs) [ 775.377980] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 775.377980] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] return func(*args, **kwargs) [ 775.377980] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 775.377980] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] raise e [ 775.377980] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 775.377980] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] nwinfo = self.network_api.allocate_for_instance( [ 775.377980] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 775.377980] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] created_port_ids = self._update_ports_for_instance( [ 775.378461] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 775.378461] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] with excutils.save_and_reraise_exception(): [ 775.378461] env[61439]: ERROR nova.compute.manager 
[instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 775.378461] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] self.force_reraise() [ 775.378461] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 775.378461] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] raise self.value [ 775.378461] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 775.378461] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] updated_port = self._update_port( [ 775.378461] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 775.378461] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] _ensure_no_port_binding_failure(port) [ 775.378461] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 775.378461] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] raise exception.PortBindingFailed(port_id=port['id']) [ 775.378757] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] nova.exception.PortBindingFailed: Binding failed for port d855f0dd-5b33-4e2d-8570-9f285221250a, please check neutron logs for more information. 
[ 775.378757] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] [ 775.378848] env[61439]: INFO nova.compute.manager [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Terminating instance [ 775.381316] env[61439]: DEBUG oslo_concurrency.lockutils [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Acquiring lock "refresh_cache-0c167874-85c7-45bf-b296-971cdae1fe6a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 775.381480] env[61439]: DEBUG oslo_concurrency.lockutils [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Acquired lock "refresh_cache-0c167874-85c7-45bf-b296-971cdae1fe6a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 775.381677] env[61439]: DEBUG nova.network.neutron [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 775.400251] env[61439]: DEBUG oslo_concurrency.lockutils [None req-728a5c1b-d451-40f7-959a-724e37ca2312 tempest-ServerAddressesNegativeTestJSON-605632328 tempest-ServerAddressesNegativeTestJSON-605632328-project-member] Lock "5f05e360-888d-41c3-87ec-e4838dc5a2f2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.577s {{(pid=61439) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 775.487009] env[61439]: DEBUG nova.network.neutron [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 775.774843] env[61439]: DEBUG nova.network.neutron [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 775.793784] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Releasing lock "refresh_cache-45a24553-3e31-4187-81fe-e9d1c8ca9353" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 775.796483] env[61439]: DEBUG nova.compute.manager [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 775.796728] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 775.797293] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-77c71ec2-223a-43ba-a8bb-1328b6c91613 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 775.810525] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73837ac8-bc79-4846-9f13-1fcbf4e2d552 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 775.836145] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 45a24553-3e31-4187-81fe-e9d1c8ca9353 could not be found. [ 775.836393] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 775.836571] env[61439]: INFO nova.compute.manager [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 775.836819] env[61439]: DEBUG oslo.service.loopingcall [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 775.837072] env[61439]: DEBUG nova.compute.manager [-] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 775.837748] env[61439]: DEBUG nova.network.neutron [-] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 775.906844] env[61439]: DEBUG nova.network.neutron [-] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 775.919199] env[61439]: DEBUG nova.network.neutron [-] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 775.930997] env[61439]: INFO nova.compute.manager [-] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Took 0.09 seconds to deallocate network for instance. 
[ 775.933581] env[61439]: DEBUG nova.compute.claims [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 775.933774] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 775.933999] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 775.996576] env[61439]: DEBUG nova.network.neutron [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 776.013516] env[61439]: DEBUG oslo_concurrency.lockutils [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Releasing lock "refresh_cache-0c167874-85c7-45bf-b296-971cdae1fe6a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 776.013516] env[61439]: DEBUG nova.compute.manager [None 
req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 776.013516] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 776.013516] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a0ff0db3-151e-45e8-ad34-75a0fad81184 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 776.024725] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cebc24c2-b00c-405c-afc7-7195af2cf4d5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 776.053481] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 0c167874-85c7-45bf-b296-971cdae1fe6a could not be found. 
[ 776.053762] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 776.054019] env[61439]: INFO nova.compute.manager [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Took 0.04 seconds to destroy the instance on the hypervisor. [ 776.054206] env[61439]: DEBUG oslo.service.loopingcall [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 776.057352] env[61439]: DEBUG nova.compute.manager [-] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 776.058454] env[61439]: DEBUG nova.network.neutron [-] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 776.104202] env[61439]: DEBUG nova.network.neutron [-] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 776.117823] env[61439]: DEBUG nova.network.neutron [-] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 776.128016] env[61439]: INFO nova.compute.manager [-] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Took 0.07 seconds to deallocate network for instance. [ 776.130589] env[61439]: DEBUG nova.compute.claims [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 776.130702] env[61439]: DEBUG oslo_concurrency.lockutils [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 776.155865] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d159de98-2353-4c06-89bb-b2a96259ad39 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 776.166031] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d91bc353-3674-4c9f-8483-b34fd0f75871 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 776.199845] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17337fec-f64e-4d7b-8820-67c7d4aa036a {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 776.208646] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-330a72ba-be1f-4919-b108-d7d69ac5ce4d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 776.222722] env[61439]: DEBUG nova.compute.provider_tree [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 776.234732] env[61439]: DEBUG nova.scheduler.client.report [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 776.251201] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.317s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 776.251832] env[61439]: ERROR nova.compute.manager [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port be1fb9f7-258b-4aca-b899-f1818b72b1f4, please check neutron logs for more information.
[ 776.251832] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Traceback (most recent call last):
[ 776.251832] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 776.251832] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] self.driver.spawn(context, instance, image_meta,
[ 776.251832] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 776.251832] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 776.251832] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 776.251832] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] vm_ref = self.build_virtual_machine(instance,
[ 776.251832] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 776.251832] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] vif_infos = vmwarevif.get_vif_info(self._session,
[ 776.251832] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 776.252806] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] for vif in network_info:
[ 776.252806] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 776.252806] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] return self._sync_wrapper(fn, *args, **kwargs)
[ 776.252806] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 776.252806] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] self.wait()
[ 776.252806] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 776.252806] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] self[:] = self._gt.wait()
[ 776.252806] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 776.252806] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] return self._exit_event.wait()
[ 776.252806] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 776.252806] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] result = hub.switch()
[ 776.252806] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 776.252806] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] return self.greenlet.switch()
[ 776.253370] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 776.253370] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] result = function(*args, **kwargs)
[ 776.253370] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 776.253370] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] return func(*args, **kwargs)
[ 776.253370] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 776.253370] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] raise e
[ 776.253370] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 776.253370] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] nwinfo = self.network_api.allocate_for_instance(
[ 776.253370] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 776.253370] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] created_port_ids = self._update_ports_for_instance(
[ 776.253370] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 776.253370] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] with excutils.save_and_reraise_exception():
[ 776.253370] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 776.254038] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] self.force_reraise()
[ 776.254038] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 776.254038] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] raise self.value
[ 776.254038] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 776.254038] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] updated_port = self._update_port(
[ 776.254038] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 776.254038] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] _ensure_no_port_binding_failure(port)
[ 776.254038] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 776.254038] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] raise exception.PortBindingFailed(port_id=port['id'])
[ 776.254038] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] nova.exception.PortBindingFailed: Binding failed for port be1fb9f7-258b-4aca-b899-f1818b72b1f4, please check neutron logs for more information.
[ 776.254038] env[61439]: ERROR nova.compute.manager [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353]
[ 776.254955] env[61439]: DEBUG nova.compute.utils [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Binding failed for port be1fb9f7-258b-4aca-b899-f1818b72b1f4, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 776.254955] env[61439]: DEBUG oslo_concurrency.lockutils [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.123s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 776.260017] env[61439]: DEBUG nova.compute.manager [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Build of instance 45a24553-3e31-4187-81fe-e9d1c8ca9353 was re-scheduled: Binding failed for port be1fb9f7-258b-4aca-b899-f1818b72b1f4, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 776.260017] env[61439]: DEBUG nova.compute.manager [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 776.260017] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Acquiring lock "refresh_cache-45a24553-3e31-4187-81fe-e9d1c8ca9353" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 776.260017] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Acquired lock "refresh_cache-45a24553-3e31-4187-81fe-e9d1c8ca9353" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 776.260244] env[61439]: DEBUG nova.network.neutron [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 776.319340] env[61439]: DEBUG nova.network.neutron [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 776.409846] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ba19ddd-7ab1-45a7-aae9-69e6b5f83688 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 776.418249] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81e12085-10e9-42a8-8ef7-cf3d3ac28473 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 776.452042] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5102557f-3283-490a-89a1-3c82e5074fd7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 776.461465] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-729330f9-28ac-4bfa-8efc-898869c33d1e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 776.479935] env[61439]: DEBUG nova.compute.provider_tree [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 776.491920] env[61439]: DEBUG nova.scheduler.client.report [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 776.515962] env[61439]: DEBUG oslo_concurrency.lockutils [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.262s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 776.516661] env[61439]: ERROR nova.compute.manager [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port d855f0dd-5b33-4e2d-8570-9f285221250a, please check neutron logs for more information.
[ 776.516661] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Traceback (most recent call last):
[ 776.516661] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 776.516661] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] self.driver.spawn(context, instance, image_meta,
[ 776.516661] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 776.516661] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 776.516661] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 776.516661] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] vm_ref = self.build_virtual_machine(instance,
[ 776.516661] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 776.516661] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] vif_infos = vmwarevif.get_vif_info(self._session,
[ 776.516661] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 776.517087] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] for vif in network_info:
[ 776.517087] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 776.517087] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] return self._sync_wrapper(fn, *args, **kwargs)
[ 776.517087] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 776.517087] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] self.wait()
[ 776.517087] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 776.517087] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] self[:] = self._gt.wait()
[ 776.517087] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 776.517087] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] return self._exit_event.wait()
[ 776.517087] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 776.517087] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] result = hub.switch()
[ 776.517087] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 776.517087] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] return self.greenlet.switch()
[ 776.517415] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 776.517415] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] result = function(*args, **kwargs)
[ 776.517415] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 776.517415] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] return func(*args, **kwargs)
[ 776.517415] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 776.517415] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] raise e
[ 776.517415] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 776.517415] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] nwinfo = self.network_api.allocate_for_instance(
[ 776.517415] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 776.517415] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] created_port_ids = self._update_ports_for_instance(
[ 776.517415] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 776.517415] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] with excutils.save_and_reraise_exception():
[ 776.517415] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 776.517724] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] self.force_reraise()
[ 776.517724] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 776.517724] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] raise self.value
[ 776.517724] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 776.517724] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] updated_port = self._update_port(
[ 776.517724] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 776.517724] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] _ensure_no_port_binding_failure(port)
[ 776.517724] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 776.517724] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] raise exception.PortBindingFailed(port_id=port['id'])
[ 776.517724] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] nova.exception.PortBindingFailed: Binding failed for port d855f0dd-5b33-4e2d-8570-9f285221250a, please check neutron logs for more information.
[ 776.517724] env[61439]: ERROR nova.compute.manager [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a]
[ 776.518020] env[61439]: DEBUG nova.compute.utils [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Binding failed for port d855f0dd-5b33-4e2d-8570-9f285221250a, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 776.519517] env[61439]: DEBUG nova.compute.manager [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Build of instance 0c167874-85c7-45bf-b296-971cdae1fe6a was re-scheduled: Binding failed for port d855f0dd-5b33-4e2d-8570-9f285221250a, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 776.520134] env[61439]: DEBUG nova.compute.manager [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 776.520331] env[61439]: DEBUG oslo_concurrency.lockutils [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Acquiring lock "refresh_cache-0c167874-85c7-45bf-b296-971cdae1fe6a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 776.520497] env[61439]: DEBUG oslo_concurrency.lockutils [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Acquired lock "refresh_cache-0c167874-85c7-45bf-b296-971cdae1fe6a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 776.520675] env[61439]: DEBUG nova.network.neutron [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 776.707777] env[61439]: DEBUG nova.network.neutron [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 776.994607] env[61439]: DEBUG nova.network.neutron [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 777.007703] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Releasing lock "refresh_cache-45a24553-3e31-4187-81fe-e9d1c8ca9353" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 777.007955] env[61439]: DEBUG nova.compute.manager [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 777.008154] env[61439]: DEBUG nova.compute.manager [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 777.008326] env[61439]: DEBUG nova.network.neutron [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 777.072410] env[61439]: DEBUG nova.network.neutron [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 777.085773] env[61439]: DEBUG nova.network.neutron [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 777.097225] env[61439]: INFO nova.compute.manager [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 45a24553-3e31-4187-81fe-e9d1c8ca9353] Took 0.09 seconds to deallocate network for instance.
[ 777.188561] env[61439]: DEBUG nova.network.neutron [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 777.208936] env[61439]: DEBUG oslo_concurrency.lockutils [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Releasing lock "refresh_cache-0c167874-85c7-45bf-b296-971cdae1fe6a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 777.210017] env[61439]: DEBUG nova.compute.manager [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 777.210017] env[61439]: DEBUG nova.compute.manager [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 777.210017] env[61439]: DEBUG nova.network.neutron [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 777.261206] env[61439]: DEBUG nova.network.neutron [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 777.277017] env[61439]: DEBUG nova.network.neutron [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 777.287894] env[61439]: INFO nova.compute.manager [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] [instance: 0c167874-85c7-45bf-b296-971cdae1fe6a] Took 0.08 seconds to deallocate network for instance.
[ 777.300039] env[61439]: INFO nova.scheduler.client.report [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Deleted allocations for instance 45a24553-3e31-4187-81fe-e9d1c8ca9353
[ 777.331242] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e3a3c194-1149-4f8a-a621-f3803180dc97 tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Lock "45a24553-3e31-4187-81fe-e9d1c8ca9353" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.097s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 777.413825] env[61439]: DEBUG oslo_concurrency.lockutils [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "c5e39fc1-989c-450b-9211-06cdd39700e6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 777.415347] env[61439]: DEBUG oslo_concurrency.lockutils [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "c5e39fc1-989c-450b-9211-06cdd39700e6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 777.433289] env[61439]: DEBUG nova.compute.manager [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 777.455841] env[61439]: INFO nova.scheduler.client.report [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Deleted allocations for instance 0c167874-85c7-45bf-b296-971cdae1fe6a
[ 777.503925] env[61439]: DEBUG oslo_concurrency.lockutils [None req-83dc816c-5698-40b0-b3e1-213289a4973e tempest-ServersNegativeTestJSON-1062426749 tempest-ServersNegativeTestJSON-1062426749-project-member] Lock "0c167874-85c7-45bf-b296-971cdae1fe6a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.109s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 777.528971] env[61439]: DEBUG oslo_concurrency.lockutils [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 777.529284] env[61439]: DEBUG oslo_concurrency.lockutils [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 777.534871] env[61439]: INFO nova.compute.claims [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 777.716713] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0697489-6b8d-4871-a1d0-4091b3b86e1b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 777.727096] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e035fd92-26bd-48e9-b2b1-835692c5f8e9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 777.766059] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebb65bfe-1ea5-48c2-9d44-35cb1228b92e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 777.776696] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab48f961-3cc0-4e19-b3b2-19139052380c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 777.792290] env[61439]: DEBUG nova.compute.provider_tree [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 777.801612] env[61439]: DEBUG nova.scheduler.client.report [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0},
'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 777.823971] env[61439]: DEBUG oslo_concurrency.lockutils [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.295s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 777.824780] env[61439]: DEBUG nova.compute.manager [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 777.871255] env[61439]: DEBUG nova.compute.utils [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 777.871410] env[61439]: DEBUG nova.compute.manager [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 777.871789] env[61439]: DEBUG nova.network.neutron [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 777.883727] env[61439]: DEBUG nova.compute.manager [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 777.957255] env[61439]: DEBUG nova.compute.manager [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 777.984073] env[61439]: DEBUG nova.virt.hardware [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 777.984477] env[61439]: DEBUG nova.virt.hardware [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 777.984477] env[61439]: DEBUG nova.virt.hardware [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 777.984680] env[61439]: DEBUG nova.virt.hardware [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Flavor pref 0:0:0 {{(pid=61439) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 777.984839] env[61439]: DEBUG nova.virt.hardware [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 777.984979] env[61439]: DEBUG nova.virt.hardware [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 777.985336] env[61439]: DEBUG nova.virt.hardware [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 777.985517] env[61439]: DEBUG nova.virt.hardware [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 777.985701] env[61439]: DEBUG nova.virt.hardware [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 777.985843] env[61439]: DEBUG nova.virt.hardware [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 777.986021] env[61439]: DEBUG nova.virt.hardware [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 777.986897] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0afb46a8-dd08-434a-984e-7e521a58a051 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 777.997743] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38258b4e-c5f3-468f-9f25-2d74dfcb44e3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 778.257356] env[61439]: DEBUG nova.policy [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2af2fd8431af45ca891f744f4d10b54f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ca364a2df93a424f8b66ee39d9b0b120', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 779.233254] env[61439]: DEBUG nova.network.neutron [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 
c5e39fc1-989c-450b-9211-06cdd39700e6] Successfully created port: 10e6e847-4841-4451-bb83-a18e31397cf3 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 780.280460] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cc0f1c2-9fa7-4567-b231-a8c58208679e tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "c5e39fc1-989c-450b-9211-06cdd39700e6" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 781.189959] env[61439]: ERROR nova.compute.manager [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port b9d0b31b-5bae-4e87-b3ee-21e3791aedd3, please check neutron logs for more information. 
[ 781.189959] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 781.189959] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 781.189959] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 781.189959] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 781.189959] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 781.189959] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 781.189959] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 781.189959] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 781.189959] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 781.189959] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 781.189959] env[61439]: ERROR nova.compute.manager raise self.value [ 781.189959] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 781.189959] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 781.189959] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 781.189959] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 781.190536] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 781.190536] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 781.190536] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port b9d0b31b-5bae-4e87-b3ee-21e3791aedd3, please check neutron logs for more information. [ 781.190536] env[61439]: ERROR nova.compute.manager [ 781.190536] env[61439]: Traceback (most recent call last): [ 781.190536] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 781.190536] env[61439]: listener.cb(fileno) [ 781.190536] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 781.190536] env[61439]: result = function(*args, **kwargs) [ 781.190536] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 781.190536] env[61439]: return func(*args, **kwargs) [ 781.190536] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 781.190536] env[61439]: raise e [ 781.190536] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 781.190536] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 781.190536] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 781.190536] env[61439]: created_port_ids = self._update_ports_for_instance( [ 781.190536] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 781.190536] env[61439]: with excutils.save_and_reraise_exception(): [ 781.190536] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 781.190536] env[61439]: self.force_reraise() [ 781.190536] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 781.190536] env[61439]: raise self.value [ 781.190536] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 781.190536] env[61439]: 
updated_port = self._update_port( [ 781.190536] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 781.190536] env[61439]: _ensure_no_port_binding_failure(port) [ 781.190536] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 781.190536] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 781.191428] env[61439]: nova.exception.PortBindingFailed: Binding failed for port b9d0b31b-5bae-4e87-b3ee-21e3791aedd3, please check neutron logs for more information. [ 781.191428] env[61439]: Removing descriptor: 22 [ 781.191428] env[61439]: ERROR nova.compute.manager [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port b9d0b31b-5bae-4e87-b3ee-21e3791aedd3, please check neutron logs for more information. 
[ 781.191428] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] Traceback (most recent call last): [ 781.191428] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 781.191428] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] yield resources [ 781.191428] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 781.191428] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] self.driver.spawn(context, instance, image_meta, [ 781.191428] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 781.191428] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] self._vmops.spawn(context, instance, image_meta, injected_files, [ 781.191428] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 781.191428] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] vm_ref = self.build_virtual_machine(instance, [ 781.191796] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 781.191796] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] vif_infos = vmwarevif.get_vif_info(self._session, [ 781.191796] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 781.191796] env[61439]: ERROR 
nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] for vif in network_info: [ 781.191796] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 781.191796] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] return self._sync_wrapper(fn, *args, **kwargs) [ 781.191796] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 781.191796] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] self.wait() [ 781.191796] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 781.191796] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] self[:] = self._gt.wait() [ 781.191796] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 781.191796] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] return self._exit_event.wait() [ 781.191796] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 781.192182] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] result = hub.switch() [ 781.192182] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 781.192182] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] return self.greenlet.switch() [ 781.192182] 
env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 781.192182] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] result = function(*args, **kwargs) [ 781.192182] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 781.192182] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] return func(*args, **kwargs) [ 781.192182] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 781.192182] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] raise e [ 781.192182] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 781.192182] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] nwinfo = self.network_api.allocate_for_instance( [ 781.192182] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 781.192182] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] created_port_ids = self._update_ports_for_instance( [ 781.192570] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 781.192570] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] with excutils.save_and_reraise_exception(): [ 781.192570] env[61439]: ERROR nova.compute.manager 
[instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 781.192570] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] self.force_reraise() [ 781.192570] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 781.192570] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] raise self.value [ 781.192570] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 781.192570] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] updated_port = self._update_port( [ 781.192570] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 781.192570] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] _ensure_no_port_binding_failure(port) [ 781.192570] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 781.192570] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] raise exception.PortBindingFailed(port_id=port['id']) [ 781.192920] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] nova.exception.PortBindingFailed: Binding failed for port b9d0b31b-5bae-4e87-b3ee-21e3791aedd3, please check neutron logs for more information. 
[ 781.192920] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] [ 781.192920] env[61439]: INFO nova.compute.manager [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] Terminating instance [ 781.196035] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquiring lock "refresh_cache-8e57e090-2330-49f5-b939-be657884e506" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 781.196035] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquired lock "refresh_cache-8e57e090-2330-49f5-b939-be657884e506" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 781.196035] env[61439]: DEBUG nova.network.neutron [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 781.239159] env[61439]: DEBUG nova.network.neutron [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 781.338369] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Acquiring lock "93635056-506d-4672-ba46-d57e919df841" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 781.338724] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Lock "93635056-506d-4672-ba46-d57e919df841" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 781.348952] env[61439]: DEBUG nova.compute.manager [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 781.424867] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 781.424867] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 781.424867] env[61439]: INFO nova.compute.claims [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 781.622019] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ef1b6d2-305e-4dd5-8075-380995da0362 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 781.632573] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a23cc99-3499-4819-bb03-969ed3cec7f3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 781.666693] env[61439]: DEBUG nova.network.neutron [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 
tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 781.671184] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f953cc8b-51ad-428d-b3c6-1d0935580f03 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 781.678130] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e68ef235-231e-40a3-8622-c2509cf2aa2b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 781.683454] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Releasing lock "refresh_cache-8e57e090-2330-49f5-b939-be657884e506" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 781.683882] env[61439]: DEBUG nova.compute.manager [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 781.684093] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 781.685017] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b7786939-75d2-4f4e-98fb-fceb65331b5e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 781.696172] env[61439]: DEBUG nova.compute.provider_tree [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 781.704508] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2eee0a2e-9bb9-4bc1-ac45-fea7d8927ea6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 781.720631] env[61439]: DEBUG nova.scheduler.client.report [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 
'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 781.735831] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8e57e090-2330-49f5-b939-be657884e506 could not be found. [ 781.739247] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 781.739247] env[61439]: INFO nova.compute.manager [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] Took 0.05 seconds to destroy the instance on the hypervisor. [ 781.739247] env[61439]: DEBUG oslo.service.loopingcall [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 781.739247] env[61439]: DEBUG nova.compute.manager [-] [instance: 8e57e090-2330-49f5-b939-be657884e506] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 781.739247] env[61439]: DEBUG nova.network.neutron [-] [instance: 8e57e090-2330-49f5-b939-be657884e506] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 781.746965] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.322s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 781.746965] env[61439]: DEBUG nova.compute.manager [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 781.801356] env[61439]: DEBUG nova.compute.utils [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 781.803195] env[61439]: DEBUG nova.compute.manager [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 781.803421] env[61439]: DEBUG nova.network.neutron [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 781.822849] env[61439]: DEBUG nova.compute.manager [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 781.906563] env[61439]: DEBUG nova.compute.manager [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 781.935526] env[61439]: DEBUG nova.virt.hardware [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 781.935762] env[61439]: DEBUG nova.virt.hardware [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 781.935918] env[61439]: DEBUG nova.virt.hardware [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 781.936122] env[61439]: DEBUG nova.virt.hardware [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Flavor 
pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 781.936278] env[61439]: DEBUG nova.virt.hardware [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 781.936427] env[61439]: DEBUG nova.virt.hardware [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 781.936732] env[61439]: DEBUG nova.virt.hardware [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 781.936792] env[61439]: DEBUG nova.virt.hardware [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 781.937053] env[61439]: DEBUG nova.virt.hardware [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 781.937251] env[61439]: DEBUG nova.virt.hardware [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 
tempest-AttachVolumeNegativeTest-1225387431-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 781.937430] env[61439]: DEBUG nova.virt.hardware [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 781.938310] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd53030b-f77b-4ac0-852f-90b6e43cd768 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 781.947366] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78b6e9b4-6c8f-4dc8-b75d-9049c2736b24 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 781.986682] env[61439]: DEBUG oslo_concurrency.lockutils [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Acquiring lock "170dbc1c-e9c8-4b23-a556-c6e212beda24" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 781.986920] env[61439]: DEBUG oslo_concurrency.lockutils [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Lock "170dbc1c-e9c8-4b23-a556-c6e212beda24" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 781.997547] env[61439]: DEBUG nova.compute.manager [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 782.025272] env[61439]: DEBUG nova.network.neutron [-] [instance: 8e57e090-2330-49f5-b939-be657884e506] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 782.034472] env[61439]: DEBUG nova.network.neutron [-] [instance: 8e57e090-2330-49f5-b939-be657884e506] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 782.053729] env[61439]: INFO nova.compute.manager [-] [instance: 8e57e090-2330-49f5-b939-be657884e506] Took 0.32 seconds to deallocate network for instance. 
[ 782.057783] env[61439]: DEBUG nova.compute.claims [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 782.057971] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 782.058200] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 782.079565] env[61439]: DEBUG oslo_concurrency.lockutils [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 782.169280] env[61439]: DEBUG nova.policy [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '721ff5c63c5f405bb7be8c486b5fd162', 'user_domain_id': 'default', 'system_scope': None, 
'domain_id': None, 'project_id': 'd841db9575854aa388acc0bbb499fd52', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 782.243659] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-999087a2-4b39-4b4a-8ccb-3783417653a1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 782.252338] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec9ae6ba-d583-4182-8cc3-4c6cb583d285 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 782.287882] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb412c67-91ef-4a0f-9e73-38ae02869b97 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 782.296347] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d4a2781-859d-48b9-abe3-9d6b890197f0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 782.315150] env[61439]: DEBUG nova.compute.provider_tree [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 782.327835] env[61439]: DEBUG nova.scheduler.client.report [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 
tempest-ServerDiskConfigTestJSON-1897992284-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 782.351279] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.293s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 782.351767] env[61439]: ERROR nova.compute.manager [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port b9d0b31b-5bae-4e87-b3ee-21e3791aedd3, please check neutron logs for more information. 
[ 782.351767] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] Traceback (most recent call last): [ 782.351767] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 782.351767] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] self.driver.spawn(context, instance, image_meta, [ 782.351767] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 782.351767] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] self._vmops.spawn(context, instance, image_meta, injected_files, [ 782.351767] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 782.351767] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] vm_ref = self.build_virtual_machine(instance, [ 782.351767] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 782.351767] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] vif_infos = vmwarevif.get_vif_info(self._session, [ 782.351767] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 782.352792] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] for vif in network_info: [ 782.352792] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 782.352792] env[61439]: ERROR 
nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] return self._sync_wrapper(fn, *args, **kwargs) [ 782.352792] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 782.352792] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] self.wait() [ 782.352792] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 782.352792] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] self[:] = self._gt.wait() [ 782.352792] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 782.352792] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] return self._exit_event.wait() [ 782.352792] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 782.352792] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] result = hub.switch() [ 782.352792] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 782.352792] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] return self.greenlet.switch() [ 782.353118] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 782.353118] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] result 
= function(*args, **kwargs) [ 782.353118] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 782.353118] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] return func(*args, **kwargs) [ 782.353118] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 782.353118] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] raise e [ 782.353118] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 782.353118] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] nwinfo = self.network_api.allocate_for_instance( [ 782.353118] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 782.353118] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] created_port_ids = self._update_ports_for_instance( [ 782.353118] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 782.353118] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] with excutils.save_and_reraise_exception(): [ 782.353118] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 782.353440] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] self.force_reraise() [ 782.353440] env[61439]: 
ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 782.353440] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] raise self.value [ 782.353440] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 782.353440] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] updated_port = self._update_port( [ 782.353440] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 782.353440] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] _ensure_no_port_binding_failure(port) [ 782.353440] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 782.353440] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] raise exception.PortBindingFailed(port_id=port['id']) [ 782.353440] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] nova.exception.PortBindingFailed: Binding failed for port b9d0b31b-5bae-4e87-b3ee-21e3791aedd3, please check neutron logs for more information. 
[ 782.353440] env[61439]: ERROR nova.compute.manager [instance: 8e57e090-2330-49f5-b939-be657884e506] [ 782.353710] env[61439]: DEBUG nova.compute.utils [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] Binding failed for port b9d0b31b-5bae-4e87-b3ee-21e3791aedd3, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 782.354042] env[61439]: DEBUG oslo_concurrency.lockutils [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.275s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 782.357279] env[61439]: INFO nova.compute.claims [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 782.363988] env[61439]: DEBUG nova.compute.manager [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] Build of instance 8e57e090-2330-49f5-b939-be657884e506 was re-scheduled: Binding failed for port b9d0b31b-5bae-4e87-b3ee-21e3791aedd3, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 782.367650] env[61439]: DEBUG nova.compute.manager [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 782.367650] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquiring lock "refresh_cache-8e57e090-2330-49f5-b939-be657884e506" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 782.367650] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquired lock "refresh_cache-8e57e090-2330-49f5-b939-be657884e506" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 782.367650] env[61439]: DEBUG nova.network.neutron [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 782.549292] env[61439]: DEBUG nova.network.neutron [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 782.600340] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ce21d92-9d4b-4c63-bca8-79eba4d133a6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 782.612533] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e861d2a-6ab5-4d8b-8309-e3582a9f93a2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 782.648779] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e3dd6da-7367-4c71-8a04-fe8472e08b43 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 782.657065] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73b7ebe1-065c-41fd-9432-457a363af66d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 782.673067] env[61439]: DEBUG nova.compute.provider_tree [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 782.684345] env[61439]: DEBUG nova.scheduler.client.report [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 
'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 782.706211] env[61439]: DEBUG oslo_concurrency.lockutils [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.352s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 782.706777] env[61439]: DEBUG nova.compute.manager [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 782.764685] env[61439]: DEBUG nova.compute.utils [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 782.766880] env[61439]: DEBUG nova.compute.manager [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 782.766880] env[61439]: DEBUG nova.network.neutron [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 782.782151] env[61439]: DEBUG nova.compute.manager [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 782.863632] env[61439]: DEBUG nova.compute.manager [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 782.897678] env[61439]: DEBUG nova.virt.hardware [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 782.897678] env[61439]: DEBUG nova.virt.hardware [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 782.897678] env[61439]: DEBUG nova.virt.hardware [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 782.897903] env[61439]: DEBUG nova.virt.hardware [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Flavor pref 0:0:0 {{(pid=61439) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 782.898029] env[61439]: DEBUG nova.virt.hardware [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 782.898364] env[61439]: DEBUG nova.virt.hardware [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 782.898364] env[61439]: DEBUG nova.virt.hardware [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 782.898535] env[61439]: DEBUG nova.virt.hardware [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 782.898707] env[61439]: DEBUG nova.virt.hardware [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 782.899231] env[61439]: DEBUG nova.virt.hardware [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 782.899443] env[61439]: DEBUG nova.virt.hardware [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 782.900322] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ce71bc2-13d7-4d77-bfed-4808a1e1df0d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 782.910802] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ca48300-744f-4e88-a919-22778da49dba {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 783.347882] env[61439]: DEBUG nova.policy [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b15100e7cea84902abad60ded4d01420', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'acc8187e781f41419aa48bdc98846e68', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 783.550164] env[61439]: DEBUG nova.network.neutron [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 
8e57e090-2330-49f5-b939-be657884e506] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 783.572599] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Releasing lock "refresh_cache-8e57e090-2330-49f5-b939-be657884e506" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 783.572599] env[61439]: DEBUG nova.compute.manager [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 783.572757] env[61439]: DEBUG nova.compute.manager [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 783.574724] env[61439]: DEBUG nova.network.neutron [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 783.672070] env[61439]: DEBUG nova.network.neutron [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 783.688606] env[61439]: DEBUG nova.network.neutron [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 783.711719] env[61439]: INFO nova.compute.manager [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 8e57e090-2330-49f5-b939-be657884e506] Took 0.14 seconds to deallocate network for instance. [ 783.857075] env[61439]: INFO nova.scheduler.client.report [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Deleted allocations for instance 8e57e090-2330-49f5-b939-be657884e506 [ 783.869944] env[61439]: DEBUG nova.network.neutron [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Successfully created port: 6d2c454f-bf8b-4baf-9f24-ffd8a6c0609f {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 783.893652] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8c3d28b1-ea37-4968-bb3a-afd25201211b tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "8e57e090-2330-49f5-b939-be657884e506" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 11.746s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 783.985903] env[61439]: DEBUG 
oslo_concurrency.lockutils [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Acquiring lock "6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 783.986158] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Lock "6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 784.012476] env[61439]: DEBUG nova.compute.manager [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 784.077975] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 784.077975] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 784.078945] env[61439]: INFO nova.compute.claims [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 784.311406] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e623b4b4-10fd-4064-a390-a4ed799f945e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 784.320221] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91de2269-e67b-4b52-86f3-7e5bad421bb6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 784.353622] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f65e15d3-12e0-45bc-820a-8bd18db0058b {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 784.363444] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b67a8ec5-cfdd-44e0-9e47-055a88c8d5d0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 784.380583] env[61439]: DEBUG nova.compute.provider_tree [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 784.394748] env[61439]: DEBUG nova.scheduler.client.report [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 784.417240] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.339s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 784.417240] env[61439]: DEBUG nova.compute.manager [None 
req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 784.463722] env[61439]: DEBUG nova.compute.utils [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 784.466025] env[61439]: DEBUG nova.compute.manager [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 784.466025] env[61439]: DEBUG nova.network.neutron [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 784.483249] env[61439]: DEBUG nova.compute.manager [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Start building block device mappings for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 784.632285] env[61439]: DEBUG nova.compute.manager [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 784.666471] env[61439]: DEBUG nova.virt.hardware [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 784.666772] env[61439]: DEBUG nova.virt.hardware [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 784.666875] env[61439]: DEBUG nova.virt.hardware [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 
tempest-ServerDiagnosticsTest-672594855-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 784.667099] env[61439]: DEBUG nova.virt.hardware [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 784.667686] env[61439]: DEBUG nova.virt.hardware [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 784.667686] env[61439]: DEBUG nova.virt.hardware [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 784.667686] env[61439]: DEBUG nova.virt.hardware [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 784.667815] env[61439]: DEBUG nova.virt.hardware [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 784.667963] env[61439]: DEBUG nova.virt.hardware [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 
tempest-ServerDiagnosticsTest-672594855-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 784.668256] env[61439]: DEBUG nova.virt.hardware [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 784.668462] env[61439]: DEBUG nova.virt.hardware [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 784.669639] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-620fdf8d-76d0-49c9-bf57-2c42d47c6dbe {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 784.681021] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d231bf9-ed9b-45e1-86d0-fc40af7a5d2a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 784.761110] env[61439]: ERROR nova.compute.manager [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 10e6e847-4841-4451-bb83-a18e31397cf3, please check neutron logs for more information. 
[ 784.761110] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 784.761110] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 784.761110] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 784.761110] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 784.761110] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 784.761110] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 784.761110] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 784.761110] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 784.761110] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 784.761110] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 784.761110] env[61439]: ERROR nova.compute.manager raise self.value [ 784.761110] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 784.761110] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 784.761110] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 784.761110] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 784.761562] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 784.761562] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 784.761562] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 10e6e847-4841-4451-bb83-a18e31397cf3, please check neutron logs for more information. [ 784.761562] env[61439]: ERROR nova.compute.manager [ 784.761562] env[61439]: Traceback (most recent call last): [ 784.761562] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 784.761562] env[61439]: listener.cb(fileno) [ 784.761562] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 784.761562] env[61439]: result = function(*args, **kwargs) [ 784.761562] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 784.761562] env[61439]: return func(*args, **kwargs) [ 784.761562] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 784.761562] env[61439]: raise e [ 784.761562] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 784.761562] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 784.761562] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 784.761562] env[61439]: created_port_ids = self._update_ports_for_instance( [ 784.761562] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 784.761562] env[61439]: with excutils.save_and_reraise_exception(): [ 784.761562] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 784.761562] env[61439]: self.force_reraise() [ 784.761562] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 784.761562] env[61439]: raise self.value [ 784.761562] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 784.761562] env[61439]: 
updated_port = self._update_port( [ 784.761562] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 784.761562] env[61439]: _ensure_no_port_binding_failure(port) [ 784.761562] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 784.761562] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 784.762350] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 10e6e847-4841-4451-bb83-a18e31397cf3, please check neutron logs for more information. [ 784.762350] env[61439]: Removing descriptor: 24 [ 784.762350] env[61439]: ERROR nova.compute.manager [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 10e6e847-4841-4451-bb83-a18e31397cf3, please check neutron logs for more information. 
[ 784.762350] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Traceback (most recent call last): [ 784.762350] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 784.762350] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] yield resources [ 784.762350] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 784.762350] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] self.driver.spawn(context, instance, image_meta, [ 784.762350] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 784.762350] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 784.762350] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 784.762350] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] vm_ref = self.build_virtual_machine(instance, [ 784.762700] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 784.762700] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] vif_infos = vmwarevif.get_vif_info(self._session, [ 784.762700] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 784.762700] env[61439]: ERROR 
nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] for vif in network_info: [ 784.762700] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 784.762700] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] return self._sync_wrapper(fn, *args, **kwargs) [ 784.762700] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 784.762700] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] self.wait() [ 784.762700] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 784.762700] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] self[:] = self._gt.wait() [ 784.762700] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 784.762700] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] return self._exit_event.wait() [ 784.762700] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 784.763040] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] result = hub.switch() [ 784.763040] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 784.763040] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] return self.greenlet.switch() [ 784.763040] 
env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 784.763040] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] result = function(*args, **kwargs) [ 784.763040] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 784.763040] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] return func(*args, **kwargs) [ 784.763040] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 784.763040] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] raise e [ 784.763040] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 784.763040] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] nwinfo = self.network_api.allocate_for_instance( [ 784.763040] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 784.763040] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] created_port_ids = self._update_ports_for_instance( [ 784.763535] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 784.763535] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] with excutils.save_and_reraise_exception(): [ 784.763535] env[61439]: ERROR nova.compute.manager 
[instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 784.763535] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] self.force_reraise() [ 784.763535] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 784.763535] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] raise self.value [ 784.763535] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 784.763535] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] updated_port = self._update_port( [ 784.763535] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 784.763535] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] _ensure_no_port_binding_failure(port) [ 784.763535] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 784.763535] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] raise exception.PortBindingFailed(port_id=port['id']) [ 784.763850] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] nova.exception.PortBindingFailed: Binding failed for port 10e6e847-4841-4451-bb83-a18e31397cf3, please check neutron logs for more information. 
[ 784.763850] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] [ 784.763850] env[61439]: INFO nova.compute.manager [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Terminating instance [ 784.763850] env[61439]: DEBUG oslo_concurrency.lockutils [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "refresh_cache-c5e39fc1-989c-450b-9211-06cdd39700e6" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 784.763850] env[61439]: DEBUG oslo_concurrency.lockutils [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquired lock "refresh_cache-c5e39fc1-989c-450b-9211-06cdd39700e6" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 784.763850] env[61439]: DEBUG nova.network.neutron [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 784.831275] env[61439]: DEBUG nova.network.neutron [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 784.963230] env[61439]: DEBUG nova.policy [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f0dc659b9ef04b0e9cdb22b584af3bbe', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1ea05f21f7ce43ad9d22fa29068b4c8d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 785.361771] env[61439]: DEBUG nova.network.neutron [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 785.378191] env[61439]: DEBUG oslo_concurrency.lockutils [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Releasing lock "refresh_cache-c5e39fc1-989c-450b-9211-06cdd39700e6" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 785.378191] env[61439]: DEBUG nova.compute.manager [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 785.378191] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 785.378912] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-68a12157-0a69-4ada-97e5-1fe1284014c5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 785.391439] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c00a609-1656-4934-b625-e530ecaa8237 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 785.426724] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c5e39fc1-989c-450b-9211-06cdd39700e6 could not be found. [ 785.426966] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 785.427280] env[61439]: INFO nova.compute.manager [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Took 0.05 seconds to destroy the instance on the hypervisor. 
[ 785.430528] env[61439]: DEBUG oslo.service.loopingcall [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 785.430528] env[61439]: DEBUG nova.compute.manager [-] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 785.430528] env[61439]: DEBUG nova.network.neutron [-] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 785.486053] env[61439]: DEBUG nova.network.neutron [-] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 785.497714] env[61439]: DEBUG nova.network.neutron [-] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 785.509553] env[61439]: INFO nova.compute.manager [-] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Took 0.08 seconds to deallocate network for instance. 
[ 785.512959] env[61439]: DEBUG nova.compute.claims [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 785.513400] env[61439]: DEBUG oslo_concurrency.lockutils [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 785.513686] env[61439]: DEBUG oslo_concurrency.lockutils [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 785.720448] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91da1f32-cd96-4f5f-a61e-4a8d23499a2d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 785.728457] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b28c27d-bef3-4409-b21e-7158775b7d91 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 785.764332] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a8b09d5-b994-4a84-934b-a4e7636a5ee7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 785.772886] 
env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26f39aff-fc60-440b-bb1d-6e63ed0e45a1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 785.788435] env[61439]: DEBUG nova.compute.provider_tree [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 785.802284] env[61439]: DEBUG nova.scheduler.client.report [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 785.820660] env[61439]: DEBUG oslo_concurrency.lockutils [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.307s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 785.821317] env[61439]: ERROR nova.compute.manager [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 
tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 10e6e847-4841-4451-bb83-a18e31397cf3, please check neutron logs for more information. [ 785.821317] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Traceback (most recent call last): [ 785.821317] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 785.821317] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] self.driver.spawn(context, instance, image_meta, [ 785.821317] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 785.821317] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 785.821317] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 785.821317] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] vm_ref = self.build_virtual_machine(instance, [ 785.821317] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 785.821317] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] vif_infos = vmwarevif.get_vif_info(self._session, [ 785.821317] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 785.821731] env[61439]: ERROR nova.compute.manager 
[instance: c5e39fc1-989c-450b-9211-06cdd39700e6] for vif in network_info: [ 785.821731] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 785.821731] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] return self._sync_wrapper(fn, *args, **kwargs) [ 785.821731] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 785.821731] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] self.wait() [ 785.821731] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 785.821731] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] self[:] = self._gt.wait() [ 785.821731] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 785.821731] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] return self._exit_event.wait() [ 785.821731] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 785.821731] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] result = hub.switch() [ 785.821731] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 785.821731] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] return self.greenlet.switch() [ 785.822149] env[61439]: ERROR 
nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 785.822149] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] result = function(*args, **kwargs) [ 785.822149] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 785.822149] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] return func(*args, **kwargs) [ 785.822149] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 785.822149] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] raise e [ 785.822149] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 785.822149] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] nwinfo = self.network_api.allocate_for_instance( [ 785.822149] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 785.822149] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] created_port_ids = self._update_ports_for_instance( [ 785.822149] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 785.822149] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] with excutils.save_and_reraise_exception(): [ 785.822149] env[61439]: ERROR nova.compute.manager [instance: 
c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 785.822538] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] self.force_reraise() [ 785.822538] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 785.822538] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] raise self.value [ 785.822538] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 785.822538] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] updated_port = self._update_port( [ 785.822538] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 785.822538] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] _ensure_no_port_binding_failure(port) [ 785.822538] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 785.822538] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] raise exception.PortBindingFailed(port_id=port['id']) [ 785.822538] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] nova.exception.PortBindingFailed: Binding failed for port 10e6e847-4841-4451-bb83-a18e31397cf3, please check neutron logs for more information. 
[ 785.822538] env[61439]: ERROR nova.compute.manager [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] [ 785.822856] env[61439]: DEBUG nova.compute.utils [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Binding failed for port 10e6e847-4841-4451-bb83-a18e31397cf3, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 785.824080] env[61439]: DEBUG nova.compute.manager [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Build of instance c5e39fc1-989c-450b-9211-06cdd39700e6 was re-scheduled: Binding failed for port 10e6e847-4841-4451-bb83-a18e31397cf3, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 785.824519] env[61439]: DEBUG nova.compute.manager [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 785.824783] env[61439]: DEBUG oslo_concurrency.lockutils [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "refresh_cache-c5e39fc1-989c-450b-9211-06cdd39700e6" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 785.824938] env[61439]: DEBUG oslo_concurrency.lockutils [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquired 
lock "refresh_cache-c5e39fc1-989c-450b-9211-06cdd39700e6" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 785.825121] env[61439]: DEBUG nova.network.neutron [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 786.047198] env[61439]: DEBUG nova.network.neutron [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 786.319649] env[61439]: DEBUG oslo_concurrency.lockutils [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Acquiring lock "5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 786.319839] env[61439]: DEBUG oslo_concurrency.lockutils [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Lock "5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 786.330477] env[61439]: DEBUG nova.compute.manager [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] 
[instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 786.401354] env[61439]: DEBUG oslo_concurrency.lockutils [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 786.401615] env[61439]: DEBUG oslo_concurrency.lockutils [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 786.404354] env[61439]: INFO nova.compute.claims [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 786.408997] env[61439]: WARNING oslo_vmware.rw_handles [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 786.408997] env[61439]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 786.408997] env[61439]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 786.408997] env[61439]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 786.408997] env[61439]: ERROR oslo_vmware.rw_handles File 
"/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 786.408997] env[61439]: ERROR oslo_vmware.rw_handles response.begin() [ 786.408997] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 786.408997] env[61439]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 786.408997] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 786.408997] env[61439]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 786.408997] env[61439]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 786.408997] env[61439]: ERROR oslo_vmware.rw_handles [ 786.409625] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Downloaded image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to vmware_temp/99326781-d3c8-4ba6-af81-b104ad181019/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 786.411389] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Caching image {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 786.411630] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Copying Virtual Disk [datastore2] vmware_temp/99326781-d3c8-4ba6-af81-b104ad181019/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk to 
[datastore2] vmware_temp/99326781-d3c8-4ba6-af81-b104ad181019/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk {{(pid=61439) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 786.411891] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9e565db3-ed80-4de4-bd77-bf5fdd77f45b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 786.419947] env[61439]: DEBUG oslo_vmware.api [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Waiting for the task: (returnval){ [ 786.419947] env[61439]: value = "task-987683" [ 786.419947] env[61439]: _type = "Task" [ 786.419947] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 786.427930] env[61439]: DEBUG oslo_vmware.api [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Task: {'id': task-987683, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 786.436542] env[61439]: DEBUG nova.network.neutron [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 786.448641] env[61439]: DEBUG oslo_concurrency.lockutils [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Releasing lock "refresh_cache-c5e39fc1-989c-450b-9211-06cdd39700e6" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 786.448966] env[61439]: DEBUG nova.compute.manager [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 786.449209] env[61439]: DEBUG nova.compute.manager [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 786.449975] env[61439]: DEBUG nova.network.neutron [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 786.505981] env[61439]: DEBUG nova.network.neutron [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 786.519411] env[61439]: DEBUG nova.network.neutron [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 786.536798] env[61439]: INFO nova.compute.manager [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Took 0.09 seconds to deallocate network for instance. 
[ 786.663959] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a89d7155-7f50-4e93-b614-fc7620853f19 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 786.675881] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-311f6980-039b-483c-8007-c401ad626db0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 786.712267] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ad5abe8-5698-432e-bf8d-2624ab0dea28 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 786.723781] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94104d52-3aea-4b57-9de5-7a8c047986ca {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 786.741525] env[61439]: DEBUG nova.compute.provider_tree [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 786.771074] env[61439]: DEBUG nova.scheduler.client.report [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': 
{'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 786.786676] env[61439]: INFO nova.scheduler.client.report [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Deleted allocations for instance c5e39fc1-989c-450b-9211-06cdd39700e6 [ 786.795286] env[61439]: DEBUG oslo_concurrency.lockutils [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.393s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 786.796917] env[61439]: DEBUG nova.compute.manager [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Start building networks asynchronously for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 786.815595] env[61439]: DEBUG oslo_concurrency.lockutils [None req-126784f4-d1e9-48db-9ead-7b67112b03c7 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "c5e39fc1-989c-450b-9211-06cdd39700e6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 9.401s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 786.817108] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cc0f1c2-9fa7-4567-b231-a8c58208679e tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "c5e39fc1-989c-450b-9211-06cdd39700e6" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 6.536s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 786.817660] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cc0f1c2-9fa7-4567-b231-a8c58208679e tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "c5e39fc1-989c-450b-9211-06cdd39700e6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 786.819189] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cc0f1c2-9fa7-4567-b231-a8c58208679e tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "c5e39fc1-989c-450b-9211-06cdd39700e6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 786.819189] env[61439]: DEBUG oslo_concurrency.lockutils [None 
req-2cc0f1c2-9fa7-4567-b231-a8c58208679e tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "c5e39fc1-989c-450b-9211-06cdd39700e6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 786.822558] env[61439]: INFO nova.compute.manager [None req-2cc0f1c2-9fa7-4567-b231-a8c58208679e tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Terminating instance [ 786.827013] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cc0f1c2-9fa7-4567-b231-a8c58208679e tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "refresh_cache-c5e39fc1-989c-450b-9211-06cdd39700e6" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 786.827230] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cc0f1c2-9fa7-4567-b231-a8c58208679e tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquired lock "refresh_cache-c5e39fc1-989c-450b-9211-06cdd39700e6" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 786.827436] env[61439]: DEBUG nova.network.neutron [None req-2cc0f1c2-9fa7-4567-b231-a8c58208679e tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 786.850042] env[61439]: DEBUG nova.network.neutron [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 
170dbc1c-e9c8-4b23-a556-c6e212beda24] Successfully created port: 09222fe3-a429-48de-868a-12985c3507fc {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 786.865172] env[61439]: DEBUG nova.compute.utils [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 786.866533] env[61439]: DEBUG nova.compute.manager [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 786.866707] env[61439]: DEBUG nova.network.neutron [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 786.883397] env[61439]: DEBUG nova.compute.manager [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 786.892202] env[61439]: DEBUG nova.network.neutron [None req-2cc0f1c2-9fa7-4567-b231-a8c58208679e tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 786.934244] env[61439]: DEBUG oslo_vmware.exceptions [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Fault InvalidArgument not matched. {{(pid=61439) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 786.935369] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 786.936842] env[61439]: ERROR nova.compute.manager [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 786.936842] env[61439]: Faults: ['InvalidArgument'] [ 786.936842] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Traceback (most recent call last): [ 786.936842] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 786.936842] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] yield resources [ 786.936842] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 786.936842] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] 
self.driver.spawn(context, instance, image_meta, [ 786.936842] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 786.936842] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] self._vmops.spawn(context, instance, image_meta, injected_files, [ 786.936842] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 786.936842] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] self._fetch_image_if_missing(context, vi) [ 786.936842] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 786.937248] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] image_cache(vi, tmp_image_ds_loc) [ 786.937248] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 786.937248] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] vm_util.copy_virtual_disk( [ 786.937248] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 786.937248] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] session._wait_for_task(vmdk_copy_task) [ 786.937248] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 786.937248] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] return 
self.wait_for_task(task_ref) [ 786.937248] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 786.937248] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] return evt.wait() [ 786.937248] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 786.937248] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] result = hub.switch() [ 786.937248] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 786.937248] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] return self.greenlet.switch() [ 786.937638] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 786.937638] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] self.f(*self.args, **self.kw) [ 786.937638] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 786.937638] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] raise exceptions.translate_fault(task_info.error) [ 786.937638] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 786.937638] env[61439]: ERROR nova.compute.manager [instance: 
8fe2bccd-5b46-4067-b72b-bdbf726c0155] Faults: ['InvalidArgument'] [ 786.937638] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] [ 786.937638] env[61439]: INFO nova.compute.manager [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Terminating instance [ 786.943017] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Acquiring lock "refresh_cache-8fe2bccd-5b46-4067-b72b-bdbf726c0155" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 786.943017] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Acquired lock "refresh_cache-8fe2bccd-5b46-4067-b72b-bdbf726c0155" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 786.943017] env[61439]: DEBUG nova.network.neutron [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 786.943472] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 786.943682] env[61439]: DEBUG 
nova.virt.vmwareapi.ds_util [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 786.944451] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-325db42e-e664-4bcc-bc28-7e7c449f42a1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 786.953912] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 786.954145] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=61439) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 786.957140] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bab0ab3a-4310-4168-b9c4-5a3bedee710e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 786.967218] env[61439]: DEBUG oslo_vmware.api [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Waiting for the task: (returnval){ [ 786.967218] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52bb7310-28f1-d18b-8aa4-bd2d888ca119" [ 786.967218] env[61439]: _type = "Task" [ 786.967218] env[61439]: } to complete. 
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 786.980250] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Preparing fetch location {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 786.980602] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Creating directory with path [datastore2] vmware_temp/9fbdec98-5e34-4b3a-a04d-228e7511f58a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 786.981785] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-53aa273f-2e89-49a5-8744-a82d01444f93 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 786.987784] env[61439]: DEBUG nova.compute.manager [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 787.003618] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Created directory with path [datastore2] vmware_temp/9fbdec98-5e34-4b3a-a04d-228e7511f58a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 787.003849] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Fetch image to [datastore2] vmware_temp/9fbdec98-5e34-4b3a-a04d-228e7511f58a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 787.004090] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to [datastore2] vmware_temp/9fbdec98-5e34-4b3a-a04d-228e7511f58a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 787.004872] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fdffa5d6-cea8-424c-8333-1f6db9996289 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 787.013967] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3226b87-8957-4ef2-8594-82d1b45b5871 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 787.018695] env[61439]: DEBUG nova.virt.hardware [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 787.018845] env[61439]: DEBUG nova.virt.hardware [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 787.019627] env[61439]: DEBUG nova.virt.hardware [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 787.019627] env[61439]: DEBUG nova.virt.hardware [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:388}} [ 787.019627] env[61439]: DEBUG nova.virt.hardware [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 787.019627] env[61439]: DEBUG nova.virt.hardware [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 787.019980] env[61439]: DEBUG nova.virt.hardware [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 787.019980] env[61439]: DEBUG nova.virt.hardware [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 787.020059] env[61439]: DEBUG nova.virt.hardware [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 787.020171] env[61439]: DEBUG nova.virt.hardware [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:575}} [ 787.020714] env[61439]: DEBUG nova.virt.hardware [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 787.021157] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7ce0646-b750-462f-acf3-61eb5fb4ebc6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 787.033798] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7496f27b-ac93-4e6a-8d08-2744c24e746e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 787.039391] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e194488-03ef-4256-81d6-71a7b0ec51d7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 787.044467] env[61439]: DEBUG nova.network.neutron [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 787.087325] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce331f07-12b8-44a6-92b1-0c84b614d066 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 787.096075] env[61439]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8ac7a8de-ff3e-423a-86d8-ae98996f1673 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 787.126917] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 787.153568] env[61439]: DEBUG nova.policy [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c468b82bd9d64e19b419a393fff4af06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f86a86563cc047459d3e7c0553c82c63', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 787.197232] env[61439]: DEBUG oslo_vmware.rw_handles [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Creating HTTP connection to 
write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9fbdec98-5e34-4b3a-a04d-228e7511f58a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 787.260180] env[61439]: DEBUG oslo_vmware.rw_handles [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Completed reading data from the image iterator. {{(pid=61439) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 787.260401] env[61439]: DEBUG oslo_vmware.rw_handles [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9fbdec98-5e34-4b3a-a04d-228e7511f58a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=61439) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 787.561908] env[61439]: DEBUG nova.network.neutron [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 787.575405] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Releasing lock "refresh_cache-8fe2bccd-5b46-4067-b72b-bdbf726c0155" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 787.575932] env[61439]: DEBUG nova.compute.manager [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 787.576184] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 787.577398] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4bba4ed4-9d0e-42c9-b95a-0e07a1bedb9f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 787.587999] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Unregistering the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 787.587999] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-884e1780-1664-4de9-aba5-c75c6b1c8a81 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 787.603573] env[61439]: DEBUG nova.network.neutron [None req-2cc0f1c2-9fa7-4567-b231-a8c58208679e tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 787.619252] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Unregistered the VM {{(pid=61439) _destroy_instance 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 787.619633] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Deleting contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 787.619837] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Deleting the datastore file [datastore2] 8fe2bccd-5b46-4067-b72b-bdbf726c0155 {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 787.624028] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8e13e830-45f0-440b-a99e-9e7a0cdc84e9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 787.624819] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cc0f1c2-9fa7-4567-b231-a8c58208679e tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Releasing lock "refresh_cache-c5e39fc1-989c-450b-9211-06cdd39700e6" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 787.625421] env[61439]: DEBUG nova.compute.manager [None req-2cc0f1c2-9fa7-4567-b231-a8c58208679e tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 787.626057] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-2cc0f1c2-9fa7-4567-b231-a8c58208679e tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 787.626728] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ba4d78ae-dd5d-4a02-b5e8-fc06e5d44803 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 787.637578] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-175a22d4-d503-4953-b75f-ca301d962043 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 787.650364] env[61439]: DEBUG oslo_vmware.api [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Waiting for the task: (returnval){ [ 787.650364] env[61439]: value = "task-987685" [ 787.650364] env[61439]: _type = "Task" [ 787.650364] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 787.658676] env[61439]: DEBUG oslo_vmware.api [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Task: {'id': task-987685, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 787.664064] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-2cc0f1c2-9fa7-4567-b231-a8c58208679e tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c5e39fc1-989c-450b-9211-06cdd39700e6 could not be found. [ 787.664529] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-2cc0f1c2-9fa7-4567-b231-a8c58208679e tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 787.665080] env[61439]: INFO nova.compute.manager [None req-2cc0f1c2-9fa7-4567-b231-a8c58208679e tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Took 0.04 seconds to destroy the instance on the hypervisor. [ 787.665399] env[61439]: DEBUG oslo.service.loopingcall [None req-2cc0f1c2-9fa7-4567-b231-a8c58208679e tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 787.665665] env[61439]: DEBUG nova.compute.manager [-] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 787.665814] env[61439]: DEBUG nova.network.neutron [-] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 787.740904] env[61439]: DEBUG nova.network.neutron [-] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 787.750187] env[61439]: DEBUG nova.network.neutron [-] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 787.767034] env[61439]: INFO nova.compute.manager [-] [instance: c5e39fc1-989c-450b-9211-06cdd39700e6] Took 0.10 seconds to deallocate network for instance. [ 787.943266] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cc0f1c2-9fa7-4567-b231-a8c58208679e tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "c5e39fc1-989c-450b-9211-06cdd39700e6" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.127s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 788.164057] env[61439]: DEBUG oslo_vmware.api [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Task: {'id': task-987685, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.0377} completed successfully. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 788.164391] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Deleted the datastore file {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 788.164633] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Deleted contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 788.165194] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 788.165506] env[61439]: INFO nova.compute.manager [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Took 0.59 seconds to destroy the instance on the hypervisor. [ 788.165800] env[61439]: DEBUG oslo.service.loopingcall [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 788.166063] env[61439]: DEBUG nova.compute.manager [-] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Skipping network deallocation for instance since networking was not requested. {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 788.169598] env[61439]: DEBUG nova.compute.claims [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 788.169815] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 788.170082] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 788.336736] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e88fb63c-b86e-4e3c-89de-1d00a539ee2d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 788.345508] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1a92041-779c-4272-8928-a9c178782ff0 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 788.380249] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-113af884-09e9-4cd8-b49b-f042bfe72827 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 788.386970] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a408de2-2f97-4365-8946-77515244d9fd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 788.403566] env[61439]: DEBUG nova.compute.provider_tree [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 788.413052] env[61439]: DEBUG nova.scheduler.client.report [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 788.428411] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.258s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 788.428984] env[61439]: ERROR nova.compute.manager [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 788.428984] env[61439]: Faults: ['InvalidArgument'] [ 788.428984] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Traceback (most recent call last): [ 788.428984] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 788.428984] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] self.driver.spawn(context, instance, image_meta, [ 788.428984] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 788.428984] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] self._vmops.spawn(context, instance, image_meta, injected_files, [ 788.428984] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 788.428984] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] self._fetch_image_if_missing(context, vi) [ 788.428984] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 788.428984] env[61439]: ERROR nova.compute.manager [instance: 
8fe2bccd-5b46-4067-b72b-bdbf726c0155] image_cache(vi, tmp_image_ds_loc) [ 788.428984] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 788.429381] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] vm_util.copy_virtual_disk( [ 788.429381] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 788.429381] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] session._wait_for_task(vmdk_copy_task) [ 788.429381] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 788.429381] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] return self.wait_for_task(task_ref) [ 788.429381] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 788.429381] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] return evt.wait() [ 788.429381] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 788.429381] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] result = hub.switch() [ 788.429381] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 788.429381] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] return 
self.greenlet.switch() [ 788.429381] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 788.429381] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] self.f(*self.args, **self.kw) [ 788.429743] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 788.429743] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] raise exceptions.translate_fault(task_info.error) [ 788.429743] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 788.429743] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Faults: ['InvalidArgument'] [ 788.429743] env[61439]: ERROR nova.compute.manager [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] [ 788.430284] env[61439]: DEBUG nova.compute.utils [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] VimFaultException {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 788.431857] env[61439]: DEBUG nova.compute.manager [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Build of instance 8fe2bccd-5b46-4067-b72b-bdbf726c0155 was re-scheduled: A specified parameter was not correct: fileType [ 788.431857] env[61439]: Faults: ['InvalidArgument'] {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 
788.432357] env[61439]: DEBUG nova.compute.manager [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 788.432675] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Acquiring lock "refresh_cache-8fe2bccd-5b46-4067-b72b-bdbf726c0155" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 788.432911] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Acquired lock "refresh_cache-8fe2bccd-5b46-4067-b72b-bdbf726c0155" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 788.433165] env[61439]: DEBUG nova.network.neutron [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 788.527168] env[61439]: DEBUG nova.network.neutron [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 788.742934] env[61439]: DEBUG nova.network.neutron [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Successfully created port: ce73ef47-1792-432a-9ea6-cc8a5709ee84 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 788.855108] env[61439]: DEBUG nova.network.neutron [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 788.864778] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Releasing lock "refresh_cache-8fe2bccd-5b46-4067-b72b-bdbf726c0155" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 788.864992] env[61439]: DEBUG nova.compute.manager [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 788.865202] env[61439]: DEBUG nova.compute.manager [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 8fe2bccd-5b46-4067-b72b-bdbf726c0155] Skipping network deallocation for instance since networking was not requested. 
{{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 788.996150] env[61439]: INFO nova.scheduler.client.report [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Deleted allocations for instance 8fe2bccd-5b46-4067-b72b-bdbf726c0155 [ 789.002133] env[61439]: DEBUG oslo_concurrency.lockutils [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Acquiring lock "405937e2-694a-4e50-bf6a-07e26193de16" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 789.002451] env[61439]: DEBUG oslo_concurrency.lockutils [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Lock "405937e2-694a-4e50-bf6a-07e26193de16" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 789.014953] env[61439]: DEBUG nova.compute.manager [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 789.033406] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9de536da-f3b3-41b0-899c-b62f3494cb53 tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Lock "8fe2bccd-5b46-4067-b72b-bdbf726c0155" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 91.575s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 789.076238] env[61439]: DEBUG oslo_concurrency.lockutils [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 789.076547] env[61439]: DEBUG oslo_concurrency.lockutils [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 789.078137] env[61439]: INFO nova.compute.claims [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 789.174100] env[61439]: DEBUG nova.network.neutron [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 
6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Successfully created port: 499fe493-f649-4f60-87ff-0031abbd78b7 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 789.277807] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-402fc92d-5d91-40a2-9228-0ef72d1e9b4c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 789.290116] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d127473-1ea9-4049-a2d3-9696f4c77a1b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 789.323594] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9278bcf1-908b-4185-b687-c4e06abed1ca {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 789.331379] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8e45543-7a97-42bb-92c1-21c57f2eecb0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 789.347429] env[61439]: DEBUG nova.compute.provider_tree [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 789.362290] env[61439]: DEBUG nova.scheduler.client.report [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on 
inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 789.380739] env[61439]: DEBUG oslo_concurrency.lockutils [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.304s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 789.381249] env[61439]: DEBUG nova.compute.manager [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 789.425534] env[61439]: DEBUG nova.compute.utils [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 789.427008] env[61439]: DEBUG nova.compute.manager [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 789.427791] env[61439]: DEBUG nova.network.neutron [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 789.445149] env[61439]: DEBUG nova.compute.manager [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 789.537568] env[61439]: DEBUG nova.compute.manager [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 789.568854] env[61439]: DEBUG nova.virt.hardware [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 789.569130] env[61439]: DEBUG nova.virt.hardware [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 789.569292] env[61439]: DEBUG nova.virt.hardware [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 789.569480] env[61439]: DEBUG nova.virt.hardware [None req-91dd8bef-65a0-451f-9338-59381431f1ea 
tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 789.569623] env[61439]: DEBUG nova.virt.hardware [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 789.569772] env[61439]: DEBUG nova.virt.hardware [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 789.569983] env[61439]: DEBUG nova.virt.hardware [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 789.571396] env[61439]: DEBUG nova.virt.hardware [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 789.571641] env[61439]: DEBUG nova.virt.hardware [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Got 1 possible topologies 
{{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 789.571814] env[61439]: DEBUG nova.virt.hardware [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 789.572494] env[61439]: DEBUG nova.virt.hardware [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 789.573760] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f485657d-6b0a-47bb-84b0-e9815e382c1f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 789.584081] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1042e13d-98af-4092-bcc9-564e21a14a14 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 789.834151] env[61439]: ERROR nova.compute.manager [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 6d2c454f-bf8b-4baf-9f24-ffd8a6c0609f, please check neutron logs for more information. 
[ 789.834151] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 789.834151] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 789.834151] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 789.834151] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 789.834151] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 789.834151] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 789.834151] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 789.834151] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 789.834151] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 789.834151] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 789.834151] env[61439]: ERROR nova.compute.manager raise self.value [ 789.834151] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 789.834151] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 789.834151] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 789.834151] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 789.834805] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 789.834805] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 789.834805] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 6d2c454f-bf8b-4baf-9f24-ffd8a6c0609f, please check neutron logs for more information. [ 789.834805] env[61439]: ERROR nova.compute.manager [ 789.834805] env[61439]: Traceback (most recent call last): [ 789.834805] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 789.834805] env[61439]: listener.cb(fileno) [ 789.834805] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 789.834805] env[61439]: result = function(*args, **kwargs) [ 789.834805] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 789.834805] env[61439]: return func(*args, **kwargs) [ 789.834805] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 789.834805] env[61439]: raise e [ 789.834805] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 789.834805] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 789.834805] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 789.834805] env[61439]: created_port_ids = self._update_ports_for_instance( [ 789.834805] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 789.834805] env[61439]: with excutils.save_and_reraise_exception(): [ 789.834805] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 789.834805] env[61439]: self.force_reraise() [ 789.834805] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 789.834805] env[61439]: raise self.value [ 789.834805] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 789.834805] env[61439]: 
updated_port = self._update_port( [ 789.834805] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 789.834805] env[61439]: _ensure_no_port_binding_failure(port) [ 789.834805] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 789.834805] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 789.835731] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 6d2c454f-bf8b-4baf-9f24-ffd8a6c0609f, please check neutron logs for more information. [ 789.835731] env[61439]: Removing descriptor: 21 [ 789.835731] env[61439]: ERROR nova.compute.manager [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 6d2c454f-bf8b-4baf-9f24-ffd8a6c0609f, please check neutron logs for more information. 
[ 789.835731] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] Traceback (most recent call last): [ 789.835731] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 789.835731] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] yield resources [ 789.835731] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 789.835731] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] self.driver.spawn(context, instance, image_meta, [ 789.835731] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 789.835731] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] self._vmops.spawn(context, instance, image_meta, injected_files, [ 789.835731] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 789.835731] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] vm_ref = self.build_virtual_machine(instance, [ 789.836109] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 789.836109] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] vif_infos = vmwarevif.get_vif_info(self._session, [ 789.836109] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 789.836109] env[61439]: ERROR 
nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] for vif in network_info: [ 789.836109] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 789.836109] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] return self._sync_wrapper(fn, *args, **kwargs) [ 789.836109] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 789.836109] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] self.wait() [ 789.836109] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 789.836109] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] self[:] = self._gt.wait() [ 789.836109] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 789.836109] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] return self._exit_event.wait() [ 789.836109] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 789.836491] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] result = hub.switch() [ 789.836491] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 789.836491] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] return self.greenlet.switch() [ 789.836491] 
env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 789.836491] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] result = function(*args, **kwargs) [ 789.836491] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 789.836491] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] return func(*args, **kwargs) [ 789.836491] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 789.836491] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] raise e [ 789.836491] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 789.836491] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] nwinfo = self.network_api.allocate_for_instance( [ 789.836491] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 789.836491] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] created_port_ids = self._update_ports_for_instance( [ 789.836820] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 789.836820] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] with excutils.save_and_reraise_exception(): [ 789.836820] env[61439]: ERROR nova.compute.manager 
[instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 789.836820] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] self.force_reraise() [ 789.836820] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 789.836820] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] raise self.value [ 789.836820] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 789.836820] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] updated_port = self._update_port( [ 789.836820] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 789.836820] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] _ensure_no_port_binding_failure(port) [ 789.836820] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 789.836820] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] raise exception.PortBindingFailed(port_id=port['id']) [ 789.837131] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] nova.exception.PortBindingFailed: Binding failed for port 6d2c454f-bf8b-4baf-9f24-ffd8a6c0609f, please check neutron logs for more information. 
[ 789.837131] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] [ 789.837131] env[61439]: INFO nova.compute.manager [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Terminating instance [ 789.837734] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Acquiring lock "refresh_cache-93635056-506d-4672-ba46-d57e919df841" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 789.837907] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Acquired lock "refresh_cache-93635056-506d-4672-ba46-d57e919df841" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 789.838108] env[61439]: DEBUG nova.network.neutron [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 789.870756] env[61439]: DEBUG nova.policy [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a264a55330124f6493ff80d4e18caf35', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b0b92a8e08a448da97a2428eee1d46b2', 
'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 789.947737] env[61439]: DEBUG nova.network.neutron [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 790.553797] env[61439]: DEBUG nova.network.neutron [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 790.567358] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Releasing lock "refresh_cache-93635056-506d-4672-ba46-d57e919df841" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 790.567358] env[61439]: DEBUG nova.compute.manager [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 790.567358] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 790.567358] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-19082bd7-047e-489b-9e32-6247d6199485 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 790.577750] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-042b81de-b80f-47a8-a64a-32f420339260 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 790.607923] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 93635056-506d-4672-ba46-d57e919df841 could not be found. 
[ 790.608281] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 790.608524] env[61439]: INFO nova.compute.manager [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Took 0.04 seconds to destroy the instance on the hypervisor. [ 790.608745] env[61439]: DEBUG oslo.service.loopingcall [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 790.609327] env[61439]: DEBUG nova.compute.manager [-] [instance: 93635056-506d-4672-ba46-d57e919df841] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 790.609486] env[61439]: DEBUG nova.network.neutron [-] [instance: 93635056-506d-4672-ba46-d57e919df841] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 790.828147] env[61439]: DEBUG nova.network.neutron [-] [instance: 93635056-506d-4672-ba46-d57e919df841] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 790.840813] env[61439]: DEBUG nova.network.neutron [-] [instance: 93635056-506d-4672-ba46-d57e919df841] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 790.855823] env[61439]: INFO nova.compute.manager [-] [instance: 93635056-506d-4672-ba46-d57e919df841] Took 0.25 seconds to deallocate network for instance. [ 790.858949] env[61439]: DEBUG nova.compute.claims [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 790.858949] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 790.859238] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 791.064209] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9613a25b-31c9-4c59-8b98-1062f0ce551a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 791.075192] env[61439]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d165450b-7c9f-453b-8e8a-c3534da141cd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 791.107054] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34935003-b53d-4ec4-973d-b5b5343c7c0c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 791.115298] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c05821e5-099d-4c98-9bb2-090708f6181e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 791.129108] env[61439]: DEBUG nova.compute.provider_tree [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 791.139038] env[61439]: DEBUG nova.scheduler.client.report [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 791.153689] env[61439]: DEBUG oslo_concurrency.lockutils [None 
req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.294s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 791.154328] env[61439]: ERROR nova.compute.manager [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 6d2c454f-bf8b-4baf-9f24-ffd8a6c0609f, please check neutron logs for more information. [ 791.154328] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] Traceback (most recent call last): [ 791.154328] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 791.154328] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] self.driver.spawn(context, instance, image_meta, [ 791.154328] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 791.154328] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] self._vmops.spawn(context, instance, image_meta, injected_files, [ 791.154328] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 791.154328] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] vm_ref = self.build_virtual_machine(instance, [ 791.154328] env[61439]: ERROR nova.compute.manager 
[instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 791.154328] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] vif_infos = vmwarevif.get_vif_info(self._session, [ 791.154328] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 791.154673] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] for vif in network_info: [ 791.154673] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 791.154673] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] return self._sync_wrapper(fn, *args, **kwargs) [ 791.154673] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 791.154673] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] self.wait() [ 791.154673] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 791.154673] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] self[:] = self._gt.wait() [ 791.154673] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 791.154673] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] return self._exit_event.wait() [ 791.154673] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 791.154673] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] result = hub.switch() [ 791.154673] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 791.154673] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] return self.greenlet.switch() [ 791.155031] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 791.155031] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] result = function(*args, **kwargs) [ 791.155031] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 791.155031] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] return func(*args, **kwargs) [ 791.155031] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 791.155031] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] raise e [ 791.155031] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 791.155031] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] nwinfo = self.network_api.allocate_for_instance( [ 791.155031] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in 
allocate_for_instance [ 791.155031] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] created_port_ids = self._update_ports_for_instance( [ 791.155031] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 791.155031] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] with excutils.save_and_reraise_exception(): [ 791.155031] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 791.155374] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] self.force_reraise() [ 791.155374] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 791.155374] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] raise self.value [ 791.155374] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 791.155374] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] updated_port = self._update_port( [ 791.155374] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 791.155374] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] _ensure_no_port_binding_failure(port) [ 791.155374] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] File "/opt/stack/nova/nova/network/neutron.py", line 294, in 
_ensure_no_port_binding_failure [ 791.155374] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] raise exception.PortBindingFailed(port_id=port['id']) [ 791.155374] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] nova.exception.PortBindingFailed: Binding failed for port 6d2c454f-bf8b-4baf-9f24-ffd8a6c0609f, please check neutron logs for more information. [ 791.155374] env[61439]: ERROR nova.compute.manager [instance: 93635056-506d-4672-ba46-d57e919df841] [ 791.155676] env[61439]: DEBUG nova.compute.utils [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Binding failed for port 6d2c454f-bf8b-4baf-9f24-ffd8a6c0609f, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 791.156683] env[61439]: DEBUG nova.compute.manager [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Build of instance 93635056-506d-4672-ba46-d57e919df841 was re-scheduled: Binding failed for port 6d2c454f-bf8b-4baf-9f24-ffd8a6c0609f, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 791.157094] env[61439]: DEBUG nova.compute.manager [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 791.157650] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Acquiring lock "refresh_cache-93635056-506d-4672-ba46-d57e919df841" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 791.157650] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Acquired lock "refresh_cache-93635056-506d-4672-ba46-d57e919df841" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 791.157650] env[61439]: DEBUG nova.network.neutron [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 791.208879] env[61439]: DEBUG nova.network.neutron [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 792.039766] env[61439]: DEBUG nova.network.neutron [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 792.058026] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Releasing lock "refresh_cache-93635056-506d-4672-ba46-d57e919df841" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 792.058144] env[61439]: DEBUG nova.compute.manager [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 792.058225] env[61439]: DEBUG nova.compute.manager [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 792.058477] env[61439]: DEBUG nova.network.neutron [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 792.152113] env[61439]: DEBUG nova.network.neutron [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 792.163849] env[61439]: DEBUG nova.network.neutron [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 792.175041] env[61439]: INFO nova.compute.manager [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 93635056-506d-4672-ba46-d57e919df841] Took 0.12 seconds to deallocate network for instance. 
[ 792.213302] env[61439]: DEBUG nova.network.neutron [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Successfully created port: 4a5edbdb-c6ce-48e6-bb8e-3f88f5501ddc {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 792.318442] env[61439]: INFO nova.scheduler.client.report [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Deleted allocations for instance 93635056-506d-4672-ba46-d57e919df841 [ 792.358260] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b3970995-6fe7-42e5-a64c-f4753d03dcdf tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Lock "93635056-506d-4672-ba46-d57e919df841" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 11.019s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 793.664758] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Acquiring lock "93ade29f-b55a-4a85-85dd-fc05699b3a21" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 793.665655] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Lock "93ade29f-b55a-4a85-85dd-fc05699b3a21" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: 
waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 793.695051] env[61439]: DEBUG nova.compute.manager [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 793.797885] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 793.798197] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 793.799758] env[61439]: INFO nova.compute.claims [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 794.046579] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66fe9841-103a-444a-8d28-f5ecf9e5e58a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 794.054878] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-538e3b85-99e1-4679-8c08-e5dbb0b7c73b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 794.092086] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f93eb82-0eb0-4faa-bac5-aabe2f4fa237 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 794.100732] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-034adfe6-369e-43c1-8cc3-efe93a2bc7c3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 794.116286] env[61439]: DEBUG nova.compute.provider_tree [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 794.127982] env[61439]: DEBUG nova.scheduler.client.report [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 794.145256] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 
tempest-ServerGroupTestJSON-1745706281-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.347s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 794.164238] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Acquiring lock "6796524c-5489-40d9-aae2-5b3665c6ffd3" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.._do_validation" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 794.166016] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Lock "6796524c-5489-40d9-aae2-5b3665c6ffd3" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.._do_validation" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 794.170531] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Lock "6796524c-5489-40d9-aae2-5b3665c6ffd3" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.._do_validation" :: held 0.006s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 794.171217] env[61439]: DEBUG nova.compute.manager [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Start building networks asynchronously for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 794.219027] env[61439]: DEBUG nova.compute.utils [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 794.222519] env[61439]: DEBUG nova.compute.manager [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 794.222753] env[61439]: DEBUG nova.network.neutron [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 794.236187] env[61439]: DEBUG nova.compute.manager [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 794.302224] env[61439]: DEBUG nova.compute.manager [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 794.329414] env[61439]: DEBUG nova.virt.hardware [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 794.329665] env[61439]: DEBUG nova.virt.hardware [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 794.329824] env[61439]: DEBUG nova.virt.hardware [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 794.330012] env[61439]: DEBUG nova.virt.hardware [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Flavor pref 0:0:0 {{(pid=61439) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 794.330184] env[61439]: DEBUG nova.virt.hardware [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 794.330375] env[61439]: DEBUG nova.virt.hardware [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 794.330593] env[61439]: DEBUG nova.virt.hardware [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 794.330867] env[61439]: DEBUG nova.virt.hardware [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 794.331131] env[61439]: DEBUG nova.virt.hardware [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 794.331311] env[61439]: DEBUG nova.virt.hardware [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 794.331491] env[61439]: DEBUG nova.virt.hardware [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 794.332421] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63fa0415-9795-45f8-bb41-fcf9c99daa1c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 794.340853] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec6ec5ff-5947-4b28-b624-2b51e62d533c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 794.666188] env[61439]: DEBUG nova.policy [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2711b8149e0f4877858ce3de2d3cbf50', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f777ddda84f140ab9ff2d486e9db78ca', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 795.111233] env[61439]: DEBUG oslo_concurrency.lockutils [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Acquiring lock 
"ac936bda-d410-4437-866c-9b3a9e04e169" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 795.111859] env[61439]: DEBUG oslo_concurrency.lockutils [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Lock "ac936bda-d410-4437-866c-9b3a9e04e169" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 795.129128] env[61439]: DEBUG nova.compute.manager [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 795.191552] env[61439]: DEBUG oslo_concurrency.lockutils [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 795.192446] env[61439]: DEBUG oslo_concurrency.lockutils [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 795.195789] env[61439]: INFO nova.compute.claims [None 
req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 795.422883] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44b1e579-1127-4d86-aa81-67af0db9b241 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 795.430038] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ad2fc9d-5298-4d35-a652-dc1c708f7284 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 795.463582] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23b0d2cd-d79c-4a66-bf85-f3cd887a0eb3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 795.471912] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-489e379a-1b1d-408b-b5f9-ca81471e9707 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 795.489830] env[61439]: DEBUG nova.compute.provider_tree [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 795.502969] env[61439]: DEBUG nova.scheduler.client.report [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Inventory has not changed for provider 
b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 795.524286] env[61439]: DEBUG oslo_concurrency.lockutils [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.332s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 795.524805] env[61439]: DEBUG nova.compute.manager [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 795.593683] env[61439]: DEBUG nova.compute.utils [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 795.595601] env[61439]: DEBUG nova.compute.manager [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 795.598094] env[61439]: DEBUG nova.network.neutron [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 795.612728] env[61439]: DEBUG nova.compute.manager [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 795.677503] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "8ab0cb27-1972-4fcb-a549-442e720d872c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 795.677911] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "8ab0cb27-1972-4fcb-a549-442e720d872c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 795.702798] env[61439]: DEBUG nova.compute.manager [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] 
Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 795.720763] env[61439]: DEBUG nova.compute.manager [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 795.752844] env[61439]: DEBUG nova.virt.hardware [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 795.753077] env[61439]: DEBUG nova.virt.hardware [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 795.753077] env[61439]: DEBUG nova.virt.hardware [None req-909192ed-311c-4224-b124-0feb2c19c2da 
tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 795.753291] env[61439]: DEBUG nova.virt.hardware [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 795.753477] env[61439]: DEBUG nova.virt.hardware [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 795.753637] env[61439]: DEBUG nova.virt.hardware [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 795.754726] env[61439]: DEBUG nova.virt.hardware [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 795.754726] env[61439]: DEBUG nova.virt.hardware [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 795.754726] env[61439]: DEBUG nova.virt.hardware [None 
req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 795.756256] env[61439]: DEBUG nova.virt.hardware [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 795.756256] env[61439]: DEBUG nova.virt.hardware [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 795.756256] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b2fe525-19f3-462a-8652-a107b951a34f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 795.771218] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c04248a2-09bd-46fb-b9ec-1649a4cb4ca6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 795.792433] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 795.792700] env[61439]: DEBUG oslo_concurrency.lockutils [None 
req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 795.794231] env[61439]: INFO nova.compute.claims [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 795.842312] env[61439]: ERROR nova.compute.manager [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 09222fe3-a429-48de-868a-12985c3507fc, please check neutron logs for more information. 
[ 795.842312] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 795.842312] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 795.842312] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 795.842312] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 795.842312] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 795.842312] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 795.842312] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 795.842312] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 795.842312] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 795.842312] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 795.842312] env[61439]: ERROR nova.compute.manager raise self.value [ 795.842312] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 795.842312] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 795.842312] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 795.842312] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 795.842772] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 795.842772] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 795.842772] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 09222fe3-a429-48de-868a-12985c3507fc, please check neutron logs for more information. [ 795.842772] env[61439]: ERROR nova.compute.manager [ 795.842772] env[61439]: Traceback (most recent call last): [ 795.842772] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 795.842772] env[61439]: listener.cb(fileno) [ 795.842772] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 795.842772] env[61439]: result = function(*args, **kwargs) [ 795.842772] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 795.842772] env[61439]: return func(*args, **kwargs) [ 795.842772] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 795.842772] env[61439]: raise e [ 795.842772] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 795.842772] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 795.842772] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 795.842772] env[61439]: created_port_ids = self._update_ports_for_instance( [ 795.842772] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 795.842772] env[61439]: with excutils.save_and_reraise_exception(): [ 795.842772] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 795.842772] env[61439]: self.force_reraise() [ 795.842772] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 795.842772] env[61439]: raise self.value [ 795.842772] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 795.842772] env[61439]: 
updated_port = self._update_port( [ 795.842772] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 795.842772] env[61439]: _ensure_no_port_binding_failure(port) [ 795.842772] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 795.842772] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 795.843817] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 09222fe3-a429-48de-868a-12985c3507fc, please check neutron logs for more information. [ 795.843817] env[61439]: Removing descriptor: 10 [ 795.843817] env[61439]: ERROR nova.compute.manager [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 09222fe3-a429-48de-868a-12985c3507fc, please check neutron logs for more information. 
[ 795.843817] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Traceback (most recent call last): [ 795.843817] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 795.843817] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] yield resources [ 795.843817] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 795.843817] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] self.driver.spawn(context, instance, image_meta, [ 795.843817] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 795.843817] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] self._vmops.spawn(context, instance, image_meta, injected_files, [ 795.843817] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 795.843817] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] vm_ref = self.build_virtual_machine(instance, [ 795.844210] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 795.844210] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] vif_infos = vmwarevif.get_vif_info(self._session, [ 795.844210] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 795.844210] env[61439]: ERROR 
nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] for vif in network_info: [ 795.844210] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 795.844210] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] return self._sync_wrapper(fn, *args, **kwargs) [ 795.844210] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 795.844210] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] self.wait() [ 795.844210] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 795.844210] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] self[:] = self._gt.wait() [ 795.844210] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 795.844210] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] return self._exit_event.wait() [ 795.844210] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 795.844544] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] result = hub.switch() [ 795.844544] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 795.844544] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] return self.greenlet.switch() [ 795.844544] 
env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 795.844544] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] result = function(*args, **kwargs) [ 795.844544] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 795.844544] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] return func(*args, **kwargs) [ 795.844544] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 795.844544] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] raise e [ 795.844544] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 795.844544] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] nwinfo = self.network_api.allocate_for_instance( [ 795.844544] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 795.844544] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] created_port_ids = self._update_ports_for_instance( [ 795.844880] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 795.844880] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] with excutils.save_and_reraise_exception(): [ 795.844880] env[61439]: ERROR nova.compute.manager 
[instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 795.844880] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] self.force_reraise() [ 795.844880] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 795.844880] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] raise self.value [ 795.844880] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 795.844880] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] updated_port = self._update_port( [ 795.844880] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 795.844880] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] _ensure_no_port_binding_failure(port) [ 795.844880] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 795.844880] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] raise exception.PortBindingFailed(port_id=port['id']) [ 795.845200] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] nova.exception.PortBindingFailed: Binding failed for port 09222fe3-a429-48de-868a-12985c3507fc, please check neutron logs for more information. 
[ 795.845200] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] [ 795.845200] env[61439]: INFO nova.compute.manager [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Terminating instance [ 795.852173] env[61439]: DEBUG oslo_concurrency.lockutils [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Acquiring lock "refresh_cache-170dbc1c-e9c8-4b23-a556-c6e212beda24" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 795.852371] env[61439]: DEBUG oslo_concurrency.lockutils [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Acquired lock "refresh_cache-170dbc1c-e9c8-4b23-a556-c6e212beda24" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 795.852551] env[61439]: DEBUG nova.network.neutron [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 796.037350] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16bd876b-ca93-484f-9533-d1fdcac066cf {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 796.045336] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdf25fcc-a5c2-4af7-9614-bcaef5fd3252 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 796.079095] env[61439]: DEBUG nova.policy [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6f5f04360628402f8040f2f71638f6ef', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '865f61767268462f999468ee9c444a6b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 796.082010] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9843fc6-54a9-4e6f-befb-be56c9f45d1b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 796.090631] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d98f20ac-1be2-4293-b3c5-dc2172d043b3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 796.110375] env[61439]: DEBUG nova.compute.provider_tree [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 796.119737] env[61439]: DEBUG nova.scheduler.client.report [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Inventory has not changed for provider 
b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 796.127055] env[61439]: DEBUG nova.network.neutron [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 796.136644] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.344s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 796.137145] env[61439]: DEBUG nova.compute.manager [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Start building networks asynchronously for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 796.176232] env[61439]: DEBUG nova.compute.utils [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 796.177604] env[61439]: DEBUG nova.compute.manager [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 796.177775] env[61439]: DEBUG nova.network.neutron [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 796.194055] env[61439]: DEBUG nova.compute.manager [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 796.285602] env[61439]: DEBUG nova.compute.manager [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 796.320849] env[61439]: DEBUG nova.virt.hardware [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 796.320849] env[61439]: DEBUG nova.virt.hardware [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 796.321054] env[61439]: DEBUG nova.virt.hardware [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 796.321249] env[61439]: DEBUG nova.virt.hardware [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Flavor pref 0:0:0 {{(pid=61439) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 796.321409] env[61439]: DEBUG nova.virt.hardware [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 796.321779] env[61439]: DEBUG nova.virt.hardware [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 796.322823] env[61439]: DEBUG nova.virt.hardware [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 796.322823] env[61439]: DEBUG nova.virt.hardware [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 796.322823] env[61439]: DEBUG nova.virt.hardware [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 796.322823] env[61439]: DEBUG nova.virt.hardware [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 796.322823] env[61439]: DEBUG nova.virt.hardware [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 796.324382] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0923a94-826b-4274-97c0-bc945e8c4c50 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 796.333062] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0e95747-f2c3-4dea-a78f-a9943a3ad0ac {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 796.371117] env[61439]: DEBUG nova.policy [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2af2fd8431af45ca891f744f4d10b54f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ca364a2df93a424f8b66ee39d9b0b120', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 796.537046] env[61439]: DEBUG nova.network.neutron [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 
170dbc1c-e9c8-4b23-a556-c6e212beda24] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 796.550352] env[61439]: DEBUG oslo_concurrency.lockutils [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Releasing lock "refresh_cache-170dbc1c-e9c8-4b23-a556-c6e212beda24" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 796.550793] env[61439]: DEBUG nova.compute.manager [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 796.550981] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 796.551586] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ff9366db-152c-43b0-9673-65ec9b6dafdd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 796.564508] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-579c4502-288b-4ec7-bef2-bd12cf4ff7cc {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 796.591630] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 
tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 170dbc1c-e9c8-4b23-a556-c6e212beda24 could not be found. [ 796.591899] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 796.592139] env[61439]: INFO nova.compute.manager [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Took 0.04 seconds to destroy the instance on the hypervisor. [ 796.592588] env[61439]: DEBUG oslo.service.loopingcall [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 796.592888] env[61439]: DEBUG nova.compute.manager [-] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 796.592988] env[61439]: DEBUG nova.network.neutron [-] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 796.671893] env[61439]: DEBUG nova.network.neutron [-] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 796.673459] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquiring lock "974c8b62-f1ce-491a-a646-b091d3af2bb3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 796.674050] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "974c8b62-f1ce-491a-a646-b091d3af2bb3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 796.680825] env[61439]: DEBUG nova.network.neutron [-] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 796.691595] env[61439]: INFO nova.compute.manager [-] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Took 0.10 seconds to deallocate network for instance. 
[ 796.701019] env[61439]: DEBUG nova.compute.claims [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 796.701019] env[61439]: DEBUG oslo_concurrency.lockutils [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 796.701019] env[61439]: DEBUG oslo_concurrency.lockutils [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 796.885721] env[61439]: ERROR nova.compute.manager [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 499fe493-f649-4f60-87ff-0031abbd78b7, please check neutron logs for more information. 
[ 796.885721] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 796.885721] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 796.885721] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 796.885721] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 796.885721] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 796.885721] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 796.885721] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 796.885721] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 796.885721] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 796.885721] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 796.885721] env[61439]: ERROR nova.compute.manager raise self.value [ 796.885721] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 796.885721] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 796.885721] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 796.885721] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 796.886288] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 796.886288] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 796.886288] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 499fe493-f649-4f60-87ff-0031abbd78b7, please check neutron logs for more information. [ 796.886288] env[61439]: ERROR nova.compute.manager [ 796.888384] env[61439]: Traceback (most recent call last): [ 796.888384] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 796.888384] env[61439]: listener.cb(fileno) [ 796.888384] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 796.888384] env[61439]: result = function(*args, **kwargs) [ 796.888384] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 796.888384] env[61439]: return func(*args, **kwargs) [ 796.888384] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 796.888384] env[61439]: raise e [ 796.888384] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 796.888384] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 796.888384] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 796.888384] env[61439]: created_port_ids = self._update_ports_for_instance( [ 796.888384] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 796.888384] env[61439]: with excutils.save_and_reraise_exception(): [ 796.888384] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 796.888384] env[61439]: self.force_reraise() [ 796.888384] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 796.888384] env[61439]: raise self.value [ 796.888384] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 796.888384] env[61439]: 
updated_port = self._update_port( [ 796.888384] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 796.888384] env[61439]: _ensure_no_port_binding_failure(port) [ 796.888384] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 796.888384] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 796.888384] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 499fe493-f649-4f60-87ff-0031abbd78b7, please check neutron logs for more information. [ 796.888384] env[61439]: Removing descriptor: 22 [ 796.890526] env[61439]: ERROR nova.compute.manager [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 499fe493-f649-4f60-87ff-0031abbd78b7, please check neutron logs for more information. 
[ 796.890526] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Traceback (most recent call last): [ 796.890526] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 796.890526] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] yield resources [ 796.890526] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 796.890526] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] self.driver.spawn(context, instance, image_meta, [ 796.890526] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 796.890526] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 796.890526] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 796.890526] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] vm_ref = self.build_virtual_machine(instance, [ 796.890526] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 796.892824] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] vif_infos = vmwarevif.get_vif_info(self._session, [ 796.892824] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 796.892824] env[61439]: ERROR 
nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] for vif in network_info: [ 796.892824] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 796.892824] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] return self._sync_wrapper(fn, *args, **kwargs) [ 796.892824] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 796.892824] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] self.wait() [ 796.892824] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 796.892824] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] self[:] = self._gt.wait() [ 796.892824] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 796.892824] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] return self._exit_event.wait() [ 796.892824] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 796.892824] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] result = hub.switch() [ 796.893463] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 796.893463] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] return self.greenlet.switch() [ 796.893463] 
env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 796.893463] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] result = function(*args, **kwargs) [ 796.893463] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 796.893463] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] return func(*args, **kwargs) [ 796.893463] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 796.893463] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] raise e [ 796.893463] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 796.893463] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] nwinfo = self.network_api.allocate_for_instance( [ 796.893463] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 796.893463] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] created_port_ids = self._update_ports_for_instance( [ 796.893463] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 796.893871] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] with excutils.save_and_reraise_exception(): [ 796.893871] env[61439]: ERROR nova.compute.manager 
[instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 796.893871] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] self.force_reraise() [ 796.893871] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 796.893871] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] raise self.value [ 796.893871] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 796.893871] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] updated_port = self._update_port( [ 796.893871] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 796.893871] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] _ensure_no_port_binding_failure(port) [ 796.893871] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 796.893871] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] raise exception.PortBindingFailed(port_id=port['id']) [ 796.893871] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] nova.exception.PortBindingFailed: Binding failed for port 499fe493-f649-4f60-87ff-0031abbd78b7, please check neutron logs for more information. 
[ 796.893871] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] [ 796.894243] env[61439]: INFO nova.compute.manager [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Terminating instance [ 796.894243] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Acquiring lock "refresh_cache-6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 796.894243] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Acquired lock "refresh_cache-6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 796.894243] env[61439]: DEBUG nova.network.neutron [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 796.976674] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f632ea7b-a893-4df2-bf6e-cf34ca94d0a3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 796.982746] env[61439]: ERROR nova.compute.manager [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Instance failed network setup after 1 attempt(s): 
nova.exception.PortBindingFailed: Binding failed for port ce73ef47-1792-432a-9ea6-cc8a5709ee84, please check neutron logs for more information. [ 796.982746] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 796.982746] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 796.982746] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 796.982746] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 796.982746] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 796.982746] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 796.982746] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 796.982746] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 796.982746] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 796.982746] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 796.982746] env[61439]: ERROR nova.compute.manager raise self.value [ 796.982746] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 796.982746] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 796.982746] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 796.982746] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 796.983297] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in 
_ensure_no_port_binding_failure [ 796.983297] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 796.983297] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port ce73ef47-1792-432a-9ea6-cc8a5709ee84, please check neutron logs for more information. [ 796.983297] env[61439]: ERROR nova.compute.manager [ 796.983297] env[61439]: Traceback (most recent call last): [ 796.983297] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 796.983297] env[61439]: listener.cb(fileno) [ 796.983297] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 796.983297] env[61439]: result = function(*args, **kwargs) [ 796.983297] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 796.983297] env[61439]: return func(*args, **kwargs) [ 796.983297] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 796.983297] env[61439]: raise e [ 796.983297] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 796.983297] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 796.983297] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 796.983297] env[61439]: created_port_ids = self._update_ports_for_instance( [ 796.983297] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 796.983297] env[61439]: with excutils.save_and_reraise_exception(): [ 796.983297] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 796.983297] env[61439]: self.force_reraise() [ 796.983297] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 796.983297] env[61439]: 
raise self.value [ 796.983297] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 796.983297] env[61439]: updated_port = self._update_port( [ 796.983297] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 796.983297] env[61439]: _ensure_no_port_binding_failure(port) [ 796.983297] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 796.983297] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 796.984145] env[61439]: nova.exception.PortBindingFailed: Binding failed for port ce73ef47-1792-432a-9ea6-cc8a5709ee84, please check neutron logs for more information. [ 796.984145] env[61439]: Removing descriptor: 20 [ 796.984145] env[61439]: ERROR nova.compute.manager [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port ce73ef47-1792-432a-9ea6-cc8a5709ee84, please check neutron logs for more information. 
[ 796.984145] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Traceback (most recent call last): [ 796.984145] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 796.984145] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] yield resources [ 796.984145] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 796.984145] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] self.driver.spawn(context, instance, image_meta, [ 796.984145] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 796.984145] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 796.984145] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 796.984145] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] vm_ref = self.build_virtual_machine(instance, [ 796.984475] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 796.984475] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] vif_infos = vmwarevif.get_vif_info(self._session, [ 796.984475] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 796.984475] env[61439]: ERROR 
nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] for vif in network_info: [ 796.984475] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 796.984475] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] return self._sync_wrapper(fn, *args, **kwargs) [ 796.984475] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 796.984475] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] self.wait() [ 796.984475] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 796.984475] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] self[:] = self._gt.wait() [ 796.984475] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 796.984475] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] return self._exit_event.wait() [ 796.984475] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 796.984822] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] result = hub.switch() [ 796.984822] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 796.984822] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] return self.greenlet.switch() [ 796.984822] 
env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 796.984822] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] result = function(*args, **kwargs) [ 796.984822] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 796.984822] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] return func(*args, **kwargs) [ 796.984822] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 796.984822] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] raise e [ 796.984822] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 796.984822] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] nwinfo = self.network_api.allocate_for_instance( [ 796.984822] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 796.984822] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] created_port_ids = self._update_ports_for_instance( [ 796.985176] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 796.985176] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] with excutils.save_and_reraise_exception(): [ 796.985176] env[61439]: ERROR nova.compute.manager 
[instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 796.985176] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] self.force_reraise() [ 796.985176] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 796.985176] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] raise self.value [ 796.985176] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 796.985176] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] updated_port = self._update_port( [ 796.985176] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 796.985176] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] _ensure_no_port_binding_failure(port) [ 796.985176] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 796.985176] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] raise exception.PortBindingFailed(port_id=port['id']) [ 796.985490] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] nova.exception.PortBindingFailed: Binding failed for port ce73ef47-1792-432a-9ea6-cc8a5709ee84, please check neutron logs for more information. 
[ 796.985490] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd]
[ 796.985490] env[61439]: INFO nova.compute.manager [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Terminating instance
[ 796.986224] env[61439]: DEBUG nova.network.neutron [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 796.989389] env[61439]: DEBUG oslo_concurrency.lockutils [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Acquiring lock "refresh_cache-5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 796.990272] env[61439]: DEBUG oslo_concurrency.lockutils [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Acquired lock "refresh_cache-5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 796.990272] env[61439]: DEBUG nova.network.neutron [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 796.995933] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-746981e6-1462-4869-a33b-4fe89fc9cf10 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 797.035273] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52f764dc-efe4-4d0d-bd11-8fe3c8e91351 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 797.050046] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-370e96e5-b916-4455-bf1c-140ff7f8ec89 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 797.065848] env[61439]: DEBUG nova.compute.provider_tree [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 797.074614] env[61439]: DEBUG nova.scheduler.client.report [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 797.100798] env[61439]: DEBUG oslo_concurrency.lockutils [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.401s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 797.100798] env[61439]: ERROR nova.compute.manager [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 09222fe3-a429-48de-868a-12985c3507fc, please check neutron logs for more information.
[ 797.100798] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Traceback (most recent call last):
[ 797.100798] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 797.100798] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] self.driver.spawn(context, instance, image_meta,
[ 797.100798] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 797.100798] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 797.100798] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 797.100798] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] vm_ref = self.build_virtual_machine(instance,
[ 797.101128] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 797.101128] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] vif_infos = vmwarevif.get_vif_info(self._session,
[ 797.101128] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 797.101128] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] for vif in network_info:
[ 797.101128] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 797.101128] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] return self._sync_wrapper(fn, *args, **kwargs)
[ 797.101128] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 797.101128] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] self.wait()
[ 797.101128] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 797.101128] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] self[:] = self._gt.wait()
[ 797.101128] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 797.101128] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] return self._exit_event.wait()
[ 797.101128] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 797.101465] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] result = hub.switch()
[ 797.101465] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 797.101465] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] return self.greenlet.switch()
[ 797.101465] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 797.101465] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] result = function(*args, **kwargs)
[ 797.101465] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 797.101465] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] return func(*args, **kwargs)
[ 797.101465] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 797.101465] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] raise e
[ 797.101465] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 797.101465] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] nwinfo = self.network_api.allocate_for_instance(
[ 797.101465] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 797.101465] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] created_port_ids = self._update_ports_for_instance(
[ 797.101981] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 797.101981] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] with excutils.save_and_reraise_exception():
[ 797.101981] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 797.101981] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] self.force_reraise()
[ 797.101981] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 797.101981] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] raise self.value
[ 797.101981] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 797.101981] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] updated_port = self._update_port(
[ 797.101981] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 797.101981] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] _ensure_no_port_binding_failure(port)
[ 797.101981] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 797.101981] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] raise exception.PortBindingFailed(port_id=port['id'])
[ 797.102360] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] nova.exception.PortBindingFailed: Binding failed for port 09222fe3-a429-48de-868a-12985c3507fc, please check neutron logs for more information.
[ 797.102360] env[61439]: ERROR nova.compute.manager [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24]
[ 797.102360] env[61439]: DEBUG nova.compute.utils [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Binding failed for port 09222fe3-a429-48de-868a-12985c3507fc, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 797.103956] env[61439]: DEBUG nova.compute.manager [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Build of instance 170dbc1c-e9c8-4b23-a556-c6e212beda24 was re-scheduled: Binding failed for port 09222fe3-a429-48de-868a-12985c3507fc, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 797.104918] env[61439]: DEBUG nova.compute.manager [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 797.104918] env[61439]: DEBUG oslo_concurrency.lockutils [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Acquiring lock "refresh_cache-170dbc1c-e9c8-4b23-a556-c6e212beda24" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 797.105381] env[61439]: DEBUG oslo_concurrency.lockutils [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Acquired lock "refresh_cache-170dbc1c-e9c8-4b23-a556-c6e212beda24" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 797.105581] env[61439]: DEBUG nova.network.neutron [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 797.172345] env[61439]: DEBUG nova.network.neutron [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 797.184533] env[61439]: DEBUG nova.network.neutron [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 797.340886] env[61439]: DEBUG nova.network.neutron [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 797.358928] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Releasing lock "refresh_cache-6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 797.359380] env[61439]: DEBUG nova.compute.manager [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 797.359583] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 797.360173] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c4e0ced4-8fff-4aec-a5ec-8bc754f39889 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 797.373514] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a119e100-a3d9-4b55-aad7-c2b7cc68ea9a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 797.409703] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2 could not be found.
[ 797.409703] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 797.409703] env[61439]: INFO nova.compute.manager [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Took 0.05 seconds to destroy the instance on the hypervisor.
[ 797.409949] env[61439]: DEBUG oslo.service.loopingcall [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 797.410258] env[61439]: DEBUG nova.compute.manager [-] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 797.410330] env[61439]: DEBUG nova.network.neutron [-] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 797.413920] env[61439]: DEBUG nova.network.neutron [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Successfully created port: e9d024bf-bebd-413f-98c5-7f94c6c1dfef {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 797.664180] env[61439]: DEBUG nova.network.neutron [-] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 797.681384] env[61439]: DEBUG nova.network.neutron [-] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 797.697810] env[61439]: INFO nova.compute.manager [-] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Took 0.28 seconds to deallocate network for instance.
[ 797.697810] env[61439]: DEBUG nova.compute.claims [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 797.697810] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 797.699277] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 797.750728] env[61439]: DEBUG nova.network.neutron [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 797.769171] env[61439]: DEBUG oslo_concurrency.lockutils [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Releasing lock "refresh_cache-5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 797.769791] env[61439]: DEBUG nova.compute.manager [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 797.769791] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 797.772066] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-94b3fde2-2b24-4e6a-86cf-9c66575a0037 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 797.783511] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e8e828e-759a-46a5-af1a-346483a7ac68 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 797.811099] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd could not be found.
[ 797.811349] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 797.811844] env[61439]: INFO nova.compute.manager [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 797.812919] env[61439]: DEBUG oslo.service.loopingcall [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 797.814889] env[61439]: DEBUG nova.compute.manager [-] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 797.814998] env[61439]: DEBUG nova.network.neutron [-] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 797.817753] env[61439]: DEBUG nova.network.neutron [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 797.829524] env[61439]: DEBUG oslo_concurrency.lockutils [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Releasing lock "refresh_cache-170dbc1c-e9c8-4b23-a556-c6e212beda24" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 797.830034] env[61439]: DEBUG nova.compute.manager [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 797.830034] env[61439]: DEBUG nova.compute.manager [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 797.830238] env[61439]: DEBUG nova.network.neutron [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 797.906690] env[61439]: DEBUG nova.network.neutron [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 797.916129] env[61439]: DEBUG nova.network.neutron [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 797.931073] env[61439]: INFO nova.compute.manager [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] [instance: 170dbc1c-e9c8-4b23-a556-c6e212beda24] Took 0.10 seconds to deallocate network for instance.
[ 798.003086] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4df34a57-ccc6-4fc8-8a89-c505c7da1757 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 798.015225] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-412c336f-04bb-4115-945e-49589e233a31 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 798.019163] env[61439]: DEBUG nova.network.neutron [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Successfully created port: 5db53ed0-af24-4bcb-b026-cc45252750ed {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 798.053177] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-820f6683-67a6-43a5-bfc9-6ea79b5c94dc {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 798.066256] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c03eed6-ab4a-4749-8e16-219687e88778 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 798.084093] env[61439]: DEBUG nova.compute.provider_tree [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 798.085226] env[61439]: INFO nova.scheduler.client.report [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Deleted allocations for instance 170dbc1c-e9c8-4b23-a556-c6e212beda24
[ 798.100756] env[61439]: DEBUG nova.scheduler.client.report [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 798.114029] env[61439]: DEBUG oslo_concurrency.lockutils [None req-80b94a18-8757-4aef-8b3e-952e53f2cae4 tempest-ServerActionsTestJSON-700045971 tempest-ServerActionsTestJSON-700045971-project-member] Lock "170dbc1c-e9c8-4b23-a556-c6e212beda24" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 16.127s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 798.130867] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.433s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 798.131778] env[61439]: ERROR nova.compute.manager [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 499fe493-f649-4f60-87ff-0031abbd78b7, please check neutron logs for more information.
[ 798.131778] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Traceback (most recent call last): [ 798.131778] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 798.131778] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] self.driver.spawn(context, instance, image_meta, [ 798.131778] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 798.131778] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 798.131778] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 798.131778] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] vm_ref = self.build_virtual_machine(instance, [ 798.131778] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 798.131778] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] vif_infos = vmwarevif.get_vif_info(self._session, [ 798.131778] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 798.132377] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] for vif in network_info: [ 798.132377] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 798.132377] env[61439]: ERROR 
nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] return self._sync_wrapper(fn, *args, **kwargs) [ 798.132377] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 798.132377] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] self.wait() [ 798.132377] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 798.132377] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] self[:] = self._gt.wait() [ 798.132377] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 798.132377] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] return self._exit_event.wait() [ 798.132377] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 798.132377] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] result = hub.switch() [ 798.132377] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 798.132377] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] return self.greenlet.switch() [ 798.132704] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 798.132704] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] result 
= function(*args, **kwargs) [ 798.132704] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 798.132704] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] return func(*args, **kwargs) [ 798.132704] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 798.132704] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] raise e [ 798.132704] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 798.132704] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] nwinfo = self.network_api.allocate_for_instance( [ 798.132704] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 798.132704] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] created_port_ids = self._update_ports_for_instance( [ 798.132704] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 798.132704] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] with excutils.save_and_reraise_exception(): [ 798.132704] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 798.133035] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] self.force_reraise() [ 798.133035] env[61439]: 
ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 798.133035] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] raise self.value [ 798.133035] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 798.133035] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] updated_port = self._update_port( [ 798.133035] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 798.133035] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] _ensure_no_port_binding_failure(port) [ 798.133035] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 798.133035] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] raise exception.PortBindingFailed(port_id=port['id']) [ 798.133035] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] nova.exception.PortBindingFailed: Binding failed for port 499fe493-f649-4f60-87ff-0031abbd78b7, please check neutron logs for more information. [ 798.133035] env[61439]: ERROR nova.compute.manager [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] [ 798.133369] env[61439]: DEBUG nova.compute.utils [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Binding failed for port 499fe493-f649-4f60-87ff-0031abbd78b7, please check neutron logs for more information. 
{{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 798.133425] env[61439]: DEBUG nova.compute.manager [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 798.135832] env[61439]: DEBUG nova.compute.manager [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Build of instance 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2 was re-scheduled: Binding failed for port 499fe493-f649-4f60-87ff-0031abbd78b7, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 798.136292] env[61439]: DEBUG nova.compute.manager [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 798.136527] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Acquiring lock "refresh_cache-6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 798.136913] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Acquired lock "refresh_cache-6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2" {{(pid=61439) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 798.136913] env[61439]: DEBUG nova.network.neutron [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 798.201334] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 798.201422] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 798.206023] env[61439]: INFO nova.compute.claims [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 798.210778] env[61439]: DEBUG nova.network.neutron [-] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 798.218838] env[61439]: DEBUG nova.network.neutron [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 798.221364] env[61439]: DEBUG nova.network.neutron [-] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 798.261260] env[61439]: INFO nova.compute.manager [-] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Took 0.45 seconds to deallocate network for instance. [ 798.266936] env[61439]: DEBUG nova.compute.claims [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 798.267860] env[61439]: DEBUG oslo_concurrency.lockutils [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 798.452707] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b1e5622-8142-4c6c-9a82-228131432029 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.465530] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-29acb30d-abe6-4f32-b506-56242ac8aaba {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.506189] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be661bc0-4580-4e95-9559-dfe1e7b0a10c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.510097] env[61439]: DEBUG nova.network.neutron [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 798.519572] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b7693ba-67a3-414a-a45a-247ff584f7c9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.526437] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Releasing lock "refresh_cache-6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 798.526695] env[61439]: DEBUG nova.compute.manager [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 798.527082] env[61439]: DEBUG nova.compute.manager [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 798.527082] env[61439]: DEBUG nova.network.neutron [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 798.540836] env[61439]: DEBUG nova.compute.provider_tree [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 798.553563] env[61439]: DEBUG nova.scheduler.client.report [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 798.571444] env[61439]: DEBUG 
oslo_concurrency.lockutils [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.370s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 798.571954] env[61439]: DEBUG nova.compute.manager [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 798.577166] env[61439]: DEBUG oslo_concurrency.lockutils [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.307s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 798.586947] env[61439]: DEBUG nova.network.neutron [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 798.600679] env[61439]: DEBUG nova.network.neutron [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 798.632263] env[61439]: INFO nova.compute.manager [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] [instance: 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2] Took 0.10 seconds to deallocate network for instance. [ 798.640818] env[61439]: DEBUG nova.network.neutron [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Successfully created port: f18a0e9f-1571-438a-8dba-e8294e47767b {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 798.644867] env[61439]: DEBUG nova.compute.utils [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 798.646570] env[61439]: DEBUG nova.compute.manager [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 798.646808] env[61439]: DEBUG nova.network.neutron [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 798.678311] env[61439]: DEBUG nova.compute.manager [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 798.780071] env[61439]: INFO nova.scheduler.client.report [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Deleted allocations for instance 6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2 [ 798.789143] env[61439]: DEBUG nova.compute.manager [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 798.803885] env[61439]: DEBUG oslo_concurrency.lockutils [None req-5e7d66fc-399a-45b5-85b0-741276e198e7 tempest-ServerDiagnosticsTest-672594855 tempest-ServerDiagnosticsTest-672594855-project-member] Lock "6f5a2f67-d112-4e97-bf4c-1ee5fea5c8b2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.818s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 798.831809] env[61439]: DEBUG nova.virt.hardware [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=<?>,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-09-20T17:02:48Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 798.831809] env[61439]: DEBUG nova.virt.hardware [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 798.831809] env[61439]: DEBUG 
nova.virt.hardware [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 798.832320] env[61439]: DEBUG nova.virt.hardware [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 798.832320] env[61439]: DEBUG nova.virt.hardware [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 798.832320] env[61439]: DEBUG nova.virt.hardware [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 798.832320] env[61439]: DEBUG nova.virt.hardware [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 798.832320] env[61439]: DEBUG nova.virt.hardware [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 798.832546] env[61439]: DEBUG nova.virt.hardware [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 798.832887] env[61439]: DEBUG nova.virt.hardware [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 798.832887] env[61439]: DEBUG nova.virt.hardware [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 798.837374] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd9a8246-337d-4364-b419-71fff1a0a0fa {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.845783] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b3a7fde-7c90-4565-867b-1f55333c0d7f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.879545] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-961247a3-1188-4c47-8e0d-f075edd3f67b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.887244] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx 
with opID=oslo.vmware-b7fe0b8b-25d2-4144-aa53-905636f17318 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.917699] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34abdc7b-4d79-41f2-8fcd-20f21bcdb99d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.927649] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74daa998-4048-4806-8a11-43db3e63b79d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.941910] env[61439]: DEBUG nova.compute.provider_tree [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 798.954413] env[61439]: DEBUG nova.scheduler.client.report [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 798.974942] env[61439]: DEBUG oslo_concurrency.lockutils [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 
tempest-ImagesTestJSON-1787928631-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.400s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 798.975994] env[61439]: ERROR nova.compute.manager [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port ce73ef47-1792-432a-9ea6-cc8a5709ee84, please check neutron logs for more information. [ 798.975994] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Traceback (most recent call last): [ 798.975994] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 798.975994] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] self.driver.spawn(context, instance, image_meta, [ 798.975994] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 798.975994] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 798.975994] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 798.975994] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] vm_ref = self.build_virtual_machine(instance, [ 798.975994] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in 
build_virtual_machine [ 798.975994] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] vif_infos = vmwarevif.get_vif_info(self._session, [ 798.975994] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 798.976380] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] for vif in network_info: [ 798.976380] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 798.976380] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] return self._sync_wrapper(fn, *args, **kwargs) [ 798.976380] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 798.976380] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] self.wait() [ 798.976380] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 798.976380] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] self[:] = self._gt.wait() [ 798.976380] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 798.976380] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] return self._exit_event.wait() [ 798.976380] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 798.976380] env[61439]: ERROR nova.compute.manager [instance: 
5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] result = hub.switch() [ 798.976380] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 798.976380] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] return self.greenlet.switch() [ 798.976744] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 798.976744] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] result = function(*args, **kwargs) [ 798.976744] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 798.976744] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] return func(*args, **kwargs) [ 798.976744] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 798.976744] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] raise e [ 798.976744] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 798.976744] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] nwinfo = self.network_api.allocate_for_instance( [ 798.976744] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 798.976744] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] created_port_ids = 
self._update_ports_for_instance( [ 798.976744] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 798.976744] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] with excutils.save_and_reraise_exception(): [ 798.976744] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 798.977567] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] self.force_reraise() [ 798.977567] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 798.977567] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] raise self.value [ 798.977567] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 798.977567] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] updated_port = self._update_port( [ 798.977567] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 798.977567] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] _ensure_no_port_binding_failure(port) [ 798.977567] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 798.977567] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] raise 
exception.PortBindingFailed(port_id=port['id']) [ 798.977567] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] nova.exception.PortBindingFailed: Binding failed for port ce73ef47-1792-432a-9ea6-cc8a5709ee84, please check neutron logs for more information. [ 798.977567] env[61439]: ERROR nova.compute.manager [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] [ 798.977902] env[61439]: DEBUG nova.compute.utils [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Binding failed for port ce73ef47-1792-432a-9ea6-cc8a5709ee84, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 798.979745] env[61439]: DEBUG nova.compute.manager [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Build of instance 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd was re-scheduled: Binding failed for port ce73ef47-1792-432a-9ea6-cc8a5709ee84, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 798.979745] env[61439]: DEBUG nova.compute.manager [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 798.979745] env[61439]: DEBUG oslo_concurrency.lockutils [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Acquiring lock "refresh_cache-5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 798.979745] env[61439]: DEBUG oslo_concurrency.lockutils [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Acquired lock "refresh_cache-5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 798.979993] env[61439]: DEBUG nova.network.neutron [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 799.070137] env[61439]: DEBUG nova.network.neutron [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 799.168156] env[61439]: DEBUG nova.policy [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b861ada4972f4431b0b9bd46ae21f7cc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '16074166244d449b99488fc24f4f3d74', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 799.741260] env[61439]: DEBUG oslo_concurrency.lockutils [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Acquiring lock "2e7a252e-071f-48bb-91ae-f4bdd7907059" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 799.741502] env[61439]: DEBUG oslo_concurrency.lockutils [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Lock "2e7a252e-071f-48bb-91ae-f4bdd7907059" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 799.752049] env[61439]: DEBUG nova.compute.manager [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 
tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 799.810041] env[61439]: DEBUG oslo_concurrency.lockutils [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 799.810342] env[61439]: DEBUG oslo_concurrency.lockutils [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 799.811752] env[61439]: INFO nova.compute.claims [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 799.880972] env[61439]: DEBUG nova.network.neutron [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 799.892117] env[61439]: DEBUG oslo_concurrency.lockutils [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Releasing lock 
"refresh_cache-5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 799.892393] env[61439]: DEBUG nova.compute.manager [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 799.892587] env[61439]: DEBUG nova.compute.manager [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 799.892756] env[61439]: DEBUG nova.network.neutron [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 799.968115] env[61439]: DEBUG nova.network.neutron [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 799.978820] env[61439]: DEBUG nova.network.neutron [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 799.987884] env[61439]: INFO nova.compute.manager [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd] Took 0.09 seconds to deallocate network for instance. [ 800.015745] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09f64dad-db6c-4809-b0d7-02b9accd33c3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 800.023836] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c101831-1144-4aa9-a3ec-f88064c43b08 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 800.059754] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f5cebf8-589e-4e59-9946-81e7a200c9a6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 800.075106] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1688513-7c6b-4438-921d-941b1a19ad74 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 800.091235] env[61439]: DEBUG nova.compute.provider_tree [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 
tempest-InstanceActionsV221TestJSON-346798859-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 800.097365] env[61439]: INFO nova.scheduler.client.report [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Deleted allocations for instance 5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd [ 800.104507] env[61439]: DEBUG nova.scheduler.client.report [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 800.116143] env[61439]: DEBUG oslo_concurrency.lockutils [None req-87a4f0fb-7aad-40f2-85d1-9b3f95f9975e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Lock "5f325b13-cd19-4cb1-a9f2-9d9e6942f9bd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.796s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 800.130661] env[61439]: DEBUG oslo_concurrency.lockutils [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.320s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 800.131180] env[61439]: DEBUG nova.compute.manager [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 800.166149] env[61439]: DEBUG nova.compute.utils [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 800.167447] env[61439]: DEBUG nova.compute.manager [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 800.167726] env[61439]: DEBUG nova.network.neutron [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 800.182052] env[61439]: DEBUG nova.compute.manager [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Start building block device mappings for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 800.258412] env[61439]: DEBUG nova.compute.manager [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 800.283769] env[61439]: DEBUG nova.virt.hardware [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 800.283966] env[61439]: DEBUG nova.virt.hardware [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 800.284137] env[61439]: DEBUG nova.virt.hardware [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce 
tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 800.284321] env[61439]: DEBUG nova.virt.hardware [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 800.284471] env[61439]: DEBUG nova.virt.hardware [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 800.284643] env[61439]: DEBUG nova.virt.hardware [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 800.284880] env[61439]: DEBUG nova.virt.hardware [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 800.285154] env[61439]: DEBUG nova.virt.hardware [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 800.285374] env[61439]: 
DEBUG nova.virt.hardware [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 800.285615] env[61439]: DEBUG nova.virt.hardware [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 800.285798] env[61439]: DEBUG nova.virt.hardware [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 800.286970] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d6b9298-d5ac-4334-a3de-481e362975df {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 800.295706] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f2559dc-9ea6-452a-8501-99ee2e97a73b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 800.661219] env[61439]: DEBUG nova.policy [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9d021d8b7cf444b19d9008f370d9f195', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 
'd210a18ebdcd4c85a520ad1087664bea', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 800.970473] env[61439]: ERROR nova.compute.manager [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 4a5edbdb-c6ce-48e6-bb8e-3f88f5501ddc, please check neutron logs for more information. [ 800.970473] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 800.970473] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 800.970473] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 800.970473] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 800.970473] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 800.970473] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 800.970473] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 800.970473] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 800.970473] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 800.970473] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 800.970473] env[61439]: ERROR nova.compute.manager 
raise self.value [ 800.970473] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 800.970473] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 800.970473] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 800.970473] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 800.971130] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 800.971130] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 800.971130] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 4a5edbdb-c6ce-48e6-bb8e-3f88f5501ddc, please check neutron logs for more information. [ 800.971130] env[61439]: ERROR nova.compute.manager [ 800.971130] env[61439]: Traceback (most recent call last): [ 800.971130] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 800.971130] env[61439]: listener.cb(fileno) [ 800.971130] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 800.971130] env[61439]: result = function(*args, **kwargs) [ 800.971130] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 800.971130] env[61439]: return func(*args, **kwargs) [ 800.971130] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 800.971130] env[61439]: raise e [ 800.971130] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 800.971130] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 800.971130] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 800.971130] 
env[61439]: created_port_ids = self._update_ports_for_instance( [ 800.971130] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 800.971130] env[61439]: with excutils.save_and_reraise_exception(): [ 800.971130] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 800.971130] env[61439]: self.force_reraise() [ 800.971130] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 800.971130] env[61439]: raise self.value [ 800.971130] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 800.971130] env[61439]: updated_port = self._update_port( [ 800.971130] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 800.971130] env[61439]: _ensure_no_port_binding_failure(port) [ 800.971130] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 800.971130] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 800.971948] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 4a5edbdb-c6ce-48e6-bb8e-3f88f5501ddc, please check neutron logs for more information. [ 800.971948] env[61439]: Removing descriptor: 24 [ 800.972096] env[61439]: ERROR nova.compute.manager [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 4a5edbdb-c6ce-48e6-bb8e-3f88f5501ddc, please check neutron logs for more information. 
[ 800.972096] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Traceback (most recent call last):
[ 800.972096] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 800.972096] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] yield resources
[ 800.972096] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 800.972096] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] self.driver.spawn(context, instance, image_meta,
[ 800.972096] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 800.972096] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 800.972096] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 800.972096] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] vm_ref = self.build_virtual_machine(instance,
[ 800.972096] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 800.972533] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] vif_infos = vmwarevif.get_vif_info(self._session,
[ 800.972533] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 800.972533] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] for vif in network_info:
[ 800.972533] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 800.972533] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] return self._sync_wrapper(fn, *args, **kwargs)
[ 800.972533] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 800.972533] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] self.wait()
[ 800.972533] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 800.972533] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] self[:] = self._gt.wait()
[ 800.972533] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 800.972533] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] return self._exit_event.wait()
[ 800.972533] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 800.972533] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] result = hub.switch()
[ 800.972851] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 800.972851] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] return self.greenlet.switch()
[ 800.972851] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 800.972851] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] result = function(*args, **kwargs)
[ 800.972851] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 800.972851] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] return func(*args, **kwargs)
[ 800.972851] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 800.972851] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] raise e
[ 800.972851] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 800.972851] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] nwinfo = self.network_api.allocate_for_instance(
[ 800.972851] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 800.972851] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] created_port_ids = self._update_ports_for_instance(
[ 800.972851] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 800.973191] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] with excutils.save_and_reraise_exception():
[ 800.973191] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 800.973191] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] self.force_reraise()
[ 800.973191] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 800.973191] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] raise self.value
[ 800.973191] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 800.973191] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] updated_port = self._update_port(
[ 800.973191] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 800.973191] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] _ensure_no_port_binding_failure(port)
[ 800.973191] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 800.973191] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] raise exception.PortBindingFailed(port_id=port['id'])
[ 800.973191] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] nova.exception.PortBindingFailed: Binding failed for port 4a5edbdb-c6ce-48e6-bb8e-3f88f5501ddc, please check neutron logs for more information.
[ 800.973191] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16]
[ 800.973540] env[61439]: INFO nova.compute.manager [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Terminating instance
[ 800.977249] env[61439]: DEBUG oslo_concurrency.lockutils [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Acquiring lock "refresh_cache-405937e2-694a-4e50-bf6a-07e26193de16" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 800.977249] env[61439]: DEBUG oslo_concurrency.lockutils [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Acquired lock "refresh_cache-405937e2-694a-4e50-bf6a-07e26193de16" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 800.977249] env[61439]: DEBUG nova.network.neutron [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 801.042109] env[61439]: DEBUG nova.network.neutron [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 801.268625] env[61439]: DEBUG nova.network.neutron [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Successfully created port: 552cc4f1-2181-4a39-8014-88c9feb747d9 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 801.724933] env[61439]: DEBUG nova.network.neutron [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 801.739020] env[61439]: DEBUG oslo_concurrency.lockutils [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Releasing lock "refresh_cache-405937e2-694a-4e50-bf6a-07e26193de16" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 801.739020] env[61439]: DEBUG nova.compute.manager [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 801.739020] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 801.739020] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8c1167c1-ccb2-4eb6-a108-55f810f4d6eb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 801.749774] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-925f131c-6946-47fb-a1cc-d00967217980 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 801.778507] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 405937e2-694a-4e50-bf6a-07e26193de16 could not be found.
[ 801.778808] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 801.779160] env[61439]: INFO nova.compute.manager [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Took 0.04 seconds to destroy the instance on the hypervisor. [ 801.779466] env[61439]: DEBUG oslo.service.loopingcall [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 801.780056] env[61439]: DEBUG nova.compute.manager [-] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 801.780267] env[61439]: DEBUG nova.network.neutron [-] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 801.841949] env[61439]: DEBUG nova.network.neutron [-] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 801.853745] env[61439]: DEBUG nova.network.neutron [-] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 801.863140] env[61439]: INFO nova.compute.manager [-] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Took 0.08 seconds to deallocate network for instance. [ 801.865550] env[61439]: DEBUG nova.compute.claims [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 801.865746] env[61439]: DEBUG oslo_concurrency.lockutils [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 801.865965] env[61439]: DEBUG oslo_concurrency.lockutils [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 802.107823] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c72552a-3c55-4897-a8f9-ac705699d8c1 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 802.118708] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2653b031-3edf-4be2-a52c-0d924afb71ba {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 802.158475] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a4966e3-a69f-414b-b085-a8a35b35d059 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 802.168107] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c96d498e-9ea8-4f40-bb05-55b6d2cbc19e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 802.184178] env[61439]: DEBUG nova.compute.provider_tree [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 802.197013] env[61439]: DEBUG nova.scheduler.client.report [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) 
set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 802.221501] env[61439]: DEBUG oslo_concurrency.lockutils [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.355s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 802.224386] env[61439]: ERROR nova.compute.manager [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 4a5edbdb-c6ce-48e6-bb8e-3f88f5501ddc, please check neutron logs for more information. [ 802.224386] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Traceback (most recent call last): [ 802.224386] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 802.224386] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] self.driver.spawn(context, instance, image_meta, [ 802.224386] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 802.224386] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] self._vmops.spawn(context, instance, image_meta, injected_files, [ 802.224386] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 802.224386] 
env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] vm_ref = self.build_virtual_machine(instance, [ 802.224386] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 802.224386] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] vif_infos = vmwarevif.get_vif_info(self._session, [ 802.224386] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 802.224759] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] for vif in network_info: [ 802.224759] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 802.224759] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] return self._sync_wrapper(fn, *args, **kwargs) [ 802.224759] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 802.224759] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] self.wait() [ 802.224759] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 802.224759] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] self[:] = self._gt.wait() [ 802.224759] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 802.224759] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] return 
self._exit_event.wait() [ 802.224759] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 802.224759] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] result = hub.switch() [ 802.224759] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 802.224759] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] return self.greenlet.switch() [ 802.225122] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 802.225122] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] result = function(*args, **kwargs) [ 802.225122] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 802.225122] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] return func(*args, **kwargs) [ 802.225122] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 802.225122] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] raise e [ 802.225122] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 802.225122] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] nwinfo = self.network_api.allocate_for_instance( [ 802.225122] env[61439]: ERROR 
nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 802.225122] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] created_port_ids = self._update_ports_for_instance( [ 802.225122] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 802.225122] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] with excutils.save_and_reraise_exception(): [ 802.225122] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 802.225469] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] self.force_reraise() [ 802.225469] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 802.225469] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] raise self.value [ 802.225469] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 802.225469] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] updated_port = self._update_port( [ 802.225469] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 802.225469] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] _ensure_no_port_binding_failure(port) [ 802.225469] env[61439]: ERROR nova.compute.manager 
[instance: 405937e2-694a-4e50-bf6a-07e26193de16] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 802.225469] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] raise exception.PortBindingFailed(port_id=port['id']) [ 802.225469] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] nova.exception.PortBindingFailed: Binding failed for port 4a5edbdb-c6ce-48e6-bb8e-3f88f5501ddc, please check neutron logs for more information. [ 802.225469] env[61439]: ERROR nova.compute.manager [instance: 405937e2-694a-4e50-bf6a-07e26193de16] [ 802.225768] env[61439]: DEBUG nova.compute.utils [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Binding failed for port 4a5edbdb-c6ce-48e6-bb8e-3f88f5501ddc, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 802.225768] env[61439]: DEBUG nova.compute.manager [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Build of instance 405937e2-694a-4e50-bf6a-07e26193de16 was re-scheduled: Binding failed for port 4a5edbdb-c6ce-48e6-bb8e-3f88f5501ddc, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 802.225875] env[61439]: DEBUG nova.compute.manager [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 802.226337] env[61439]: DEBUG oslo_concurrency.lockutils [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Acquiring lock "refresh_cache-405937e2-694a-4e50-bf6a-07e26193de16" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 802.226337] env[61439]: DEBUG oslo_concurrency.lockutils [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Acquired lock "refresh_cache-405937e2-694a-4e50-bf6a-07e26193de16" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 802.226470] env[61439]: DEBUG nova.network.neutron [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 802.314730] env[61439]: DEBUG nova.network.neutron [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 802.362363] env[61439]: DEBUG nova.network.neutron [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Successfully created port: ace1f76f-cb81-4243-85fd-09aa9ab57d92 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 803.025291] env[61439]: DEBUG nova.network.neutron [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 803.039469] env[61439]: DEBUG oslo_concurrency.lockutils [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Releasing lock "refresh_cache-405937e2-694a-4e50-bf6a-07e26193de16" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 803.039469] env[61439]: DEBUG nova.compute.manager [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 803.039469] env[61439]: DEBUG nova.compute.manager [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 803.039469] env[61439]: DEBUG nova.network.neutron [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 803.147661] env[61439]: DEBUG nova.network.neutron [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 803.154987] env[61439]: DEBUG nova.network.neutron [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 803.171428] env[61439]: INFO nova.compute.manager [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] [instance: 405937e2-694a-4e50-bf6a-07e26193de16] Took 0.13 seconds to deallocate network for instance. [ 803.303931] env[61439]: INFO nova.scheduler.client.report [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Deleted allocations for instance 405937e2-694a-4e50-bf6a-07e26193de16 [ 803.333979] env[61439]: DEBUG oslo_concurrency.lockutils [None req-91dd8bef-65a0-451f-9338-59381431f1ea tempest-ServersNegativeTestMultiTenantJSON-2038123386 tempest-ServersNegativeTestMultiTenantJSON-2038123386-project-member] Lock "405937e2-694a-4e50-bf6a-07e26193de16" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.331s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 805.210800] env[61439]: ERROR nova.compute.manager [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port e9d024bf-bebd-413f-98c5-7f94c6c1dfef, please check neutron 
logs for more information. [ 805.210800] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 805.210800] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 805.210800] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 805.210800] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 805.210800] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 805.210800] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 805.210800] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 805.210800] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 805.210800] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 805.210800] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 805.210800] env[61439]: ERROR nova.compute.manager raise self.value [ 805.210800] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 805.210800] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 805.210800] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 805.210800] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 805.212726] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 805.212726] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 
805.212726] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port e9d024bf-bebd-413f-98c5-7f94c6c1dfef, please check neutron logs for more information. [ 805.212726] env[61439]: ERROR nova.compute.manager [ 805.212726] env[61439]: Traceback (most recent call last): [ 805.212726] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 805.212726] env[61439]: listener.cb(fileno) [ 805.212726] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 805.212726] env[61439]: result = function(*args, **kwargs) [ 805.212726] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 805.212726] env[61439]: return func(*args, **kwargs) [ 805.212726] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 805.212726] env[61439]: raise e [ 805.212726] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 805.212726] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 805.212726] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 805.212726] env[61439]: created_port_ids = self._update_ports_for_instance( [ 805.212726] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 805.212726] env[61439]: with excutils.save_and_reraise_exception(): [ 805.212726] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 805.212726] env[61439]: self.force_reraise() [ 805.212726] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 805.212726] env[61439]: raise self.value [ 805.212726] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 
805.212726] env[61439]: updated_port = self._update_port( [ 805.212726] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 805.212726] env[61439]: _ensure_no_port_binding_failure(port) [ 805.212726] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 805.212726] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 805.218855] env[61439]: nova.exception.PortBindingFailed: Binding failed for port e9d024bf-bebd-413f-98c5-7f94c6c1dfef, please check neutron logs for more information. [ 805.218855] env[61439]: Removing descriptor: 21 [ 805.218855] env[61439]: ERROR nova.compute.manager [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port e9d024bf-bebd-413f-98c5-7f94c6c1dfef, please check neutron logs for more information. 
[ 805.218855] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Traceback (most recent call last): [ 805.218855] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 805.218855] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] yield resources [ 805.218855] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 805.218855] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] self.driver.spawn(context, instance, image_meta, [ 805.218855] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 805.218855] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] self._vmops.spawn(context, instance, image_meta, injected_files, [ 805.218855] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 805.218855] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] vm_ref = self.build_virtual_machine(instance, [ 805.222209] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 805.222209] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] vif_infos = vmwarevif.get_vif_info(self._session, [ 805.222209] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 805.222209] env[61439]: ERROR 
nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] for vif in network_info: [ 805.222209] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 805.222209] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] return self._sync_wrapper(fn, *args, **kwargs) [ 805.222209] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 805.222209] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] self.wait() [ 805.222209] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 805.222209] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] self[:] = self._gt.wait() [ 805.222209] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 805.222209] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] return self._exit_event.wait() [ 805.222209] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 805.222944] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] result = hub.switch() [ 805.222944] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 805.222944] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] return self.greenlet.switch() [ 805.222944] 
env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 805.222944] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] result = function(*args, **kwargs) [ 805.222944] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 805.222944] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] return func(*args, **kwargs) [ 805.222944] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 805.222944] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] raise e [ 805.222944] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 805.222944] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] nwinfo = self.network_api.allocate_for_instance( [ 805.222944] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 805.222944] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] created_port_ids = self._update_ports_for_instance( [ 805.223338] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 805.223338] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] with excutils.save_and_reraise_exception(): [ 805.223338] env[61439]: ERROR nova.compute.manager 
[instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 805.223338] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] self.force_reraise() [ 805.223338] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 805.223338] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] raise self.value [ 805.223338] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 805.223338] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] updated_port = self._update_port( [ 805.223338] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 805.223338] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] _ensure_no_port_binding_failure(port) [ 805.223338] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 805.223338] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] raise exception.PortBindingFailed(port_id=port['id']) [ 805.223727] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] nova.exception.PortBindingFailed: Binding failed for port e9d024bf-bebd-413f-98c5-7f94c6c1dfef, please check neutron logs for more information. 
[ 805.223727] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] [ 805.223727] env[61439]: INFO nova.compute.manager [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Terminating instance [ 805.223727] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Acquiring lock "refresh_cache-93ade29f-b55a-4a85-85dd-fc05699b3a21" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 805.223727] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Acquired lock "refresh_cache-93ade29f-b55a-4a85-85dd-fc05699b3a21" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 805.223727] env[61439]: DEBUG nova.network.neutron [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 805.280342] env[61439]: DEBUG nova.network.neutron [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 805.663286] env[61439]: DEBUG nova.network.neutron [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 805.674296] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Releasing lock "refresh_cache-93ade29f-b55a-4a85-85dd-fc05699b3a21" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 805.674708] env[61439]: DEBUG nova.compute.manager [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 805.674898] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 805.675479] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8da113ed-c93f-48c8-83ca-4fb6444335d6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 805.693556] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a825c126-c84e-4fea-9fc4-d13e1f91c9b9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 805.722099] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 93ade29f-b55a-4a85-85dd-fc05699b3a21 could not be found. [ 805.722395] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 805.722624] env[61439]: INFO nova.compute.manager [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Took 0.05 seconds to destroy the instance on the hypervisor. 
[ 805.722877] env[61439]: DEBUG oslo.service.loopingcall [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 805.723190] env[61439]: DEBUG nova.compute.manager [-] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 805.723297] env[61439]: DEBUG nova.network.neutron [-] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 805.782469] env[61439]: DEBUG nova.network.neutron [-] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 805.792900] env[61439]: DEBUG nova.network.neutron [-] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 805.802920] env[61439]: INFO nova.compute.manager [-] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Took 0.08 seconds to deallocate network for instance. 
[ 805.805060] env[61439]: DEBUG nova.compute.claims [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 805.805239] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 805.805473] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 805.919898] env[61439]: ERROR nova.compute.manager [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port f18a0e9f-1571-438a-8dba-e8294e47767b, please check neutron logs for more information. 
[ 805.919898] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 805.919898] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 805.919898] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 805.919898] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 805.919898] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 805.919898] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 805.919898] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 805.919898] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 805.919898] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 805.919898] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 805.919898] env[61439]: ERROR nova.compute.manager raise self.value [ 805.919898] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 805.919898] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 805.919898] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 805.919898] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 805.920424] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 805.920424] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 805.920424] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port f18a0e9f-1571-438a-8dba-e8294e47767b, please check neutron logs for more information. [ 805.920424] env[61439]: ERROR nova.compute.manager [ 805.920424] env[61439]: Traceback (most recent call last): [ 805.920424] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 805.920424] env[61439]: listener.cb(fileno) [ 805.920424] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 805.920424] env[61439]: result = function(*args, **kwargs) [ 805.920424] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 805.920424] env[61439]: return func(*args, **kwargs) [ 805.920424] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 805.920424] env[61439]: raise e [ 805.920424] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 805.920424] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 805.920424] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 805.920424] env[61439]: created_port_ids = self._update_ports_for_instance( [ 805.920424] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 805.920424] env[61439]: with excutils.save_and_reraise_exception(): [ 805.920424] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 805.920424] env[61439]: self.force_reraise() [ 805.920424] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 805.920424] env[61439]: raise self.value [ 805.920424] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 805.920424] env[61439]: 
updated_port = self._update_port( [ 805.920424] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 805.920424] env[61439]: _ensure_no_port_binding_failure(port) [ 805.920424] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 805.920424] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 805.921139] env[61439]: nova.exception.PortBindingFailed: Binding failed for port f18a0e9f-1571-438a-8dba-e8294e47767b, please check neutron logs for more information. [ 805.921139] env[61439]: Removing descriptor: 23 [ 805.921139] env[61439]: ERROR nova.compute.manager [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port f18a0e9f-1571-438a-8dba-e8294e47767b, please check neutron logs for more information. 
[ 805.921139] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Traceback (most recent call last): [ 805.921139] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 805.921139] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] yield resources [ 805.921139] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 805.921139] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] self.driver.spawn(context, instance, image_meta, [ 805.921139] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 805.921139] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] self._vmops.spawn(context, instance, image_meta, injected_files, [ 805.921139] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 805.921139] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] vm_ref = self.build_virtual_machine(instance, [ 805.921450] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 805.921450] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] vif_infos = vmwarevif.get_vif_info(self._session, [ 805.921450] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 805.921450] env[61439]: ERROR 
nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] for vif in network_info: [ 805.921450] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 805.921450] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] return self._sync_wrapper(fn, *args, **kwargs) [ 805.921450] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 805.921450] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] self.wait() [ 805.921450] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 805.921450] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] self[:] = self._gt.wait() [ 805.921450] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 805.921450] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] return self._exit_event.wait() [ 805.921450] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 805.921823] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] result = hub.switch() [ 805.921823] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 805.921823] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] return self.greenlet.switch() [ 805.921823] 
env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 805.921823] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] result = function(*args, **kwargs) [ 805.921823] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 805.921823] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] return func(*args, **kwargs) [ 805.921823] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 805.921823] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] raise e [ 805.921823] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 805.921823] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] nwinfo = self.network_api.allocate_for_instance( [ 805.921823] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 805.921823] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] created_port_ids = self._update_ports_for_instance( [ 805.922205] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 805.922205] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] with excutils.save_and_reraise_exception(): [ 805.922205] env[61439]: ERROR nova.compute.manager 
[instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 805.922205] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] self.force_reraise() [ 805.922205] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 805.922205] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] raise self.value [ 805.922205] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 805.922205] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] updated_port = self._update_port( [ 805.922205] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 805.922205] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] _ensure_no_port_binding_failure(port) [ 805.922205] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 805.922205] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] raise exception.PortBindingFailed(port_id=port['id']) [ 805.922602] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] nova.exception.PortBindingFailed: Binding failed for port f18a0e9f-1571-438a-8dba-e8294e47767b, please check neutron logs for more information. 
[ 805.922602] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] [ 805.922602] env[61439]: INFO nova.compute.manager [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Terminating instance [ 805.923946] env[61439]: DEBUG oslo_concurrency.lockutils [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Acquiring lock "refresh_cache-ac936bda-d410-4437-866c-9b3a9e04e169" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 805.923946] env[61439]: DEBUG oslo_concurrency.lockutils [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Acquired lock "refresh_cache-ac936bda-d410-4437-866c-9b3a9e04e169" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 805.923946] env[61439]: DEBUG nova.network.neutron [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 805.966304] env[61439]: DEBUG nova.network.neutron [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 806.003694] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e44f764-7e49-481e-8e5f-4793ea15f04f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 806.012842] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b6d4a6d-59d5-40b3-a0fe-40f32eae0ae8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 806.044008] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25128855-f8b8-4792-885d-f69ab3b084eb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 806.051922] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f9fba5d-23a2-4d75-9ab4-44a491ec85d8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 806.067561] env[61439]: DEBUG nova.compute.provider_tree [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 806.078512] env[61439]: DEBUG nova.scheduler.client.report [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 
'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 806.096798] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.291s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 806.098637] env[61439]: ERROR nova.compute.manager [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port e9d024bf-bebd-413f-98c5-7f94c6c1dfef, please check neutron logs for more information. 
[ 806.098637] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Traceback (most recent call last): [ 806.098637] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 806.098637] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] self.driver.spawn(context, instance, image_meta, [ 806.098637] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 806.098637] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] self._vmops.spawn(context, instance, image_meta, injected_files, [ 806.098637] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 806.098637] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] vm_ref = self.build_virtual_machine(instance, [ 806.098637] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 806.098637] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] vif_infos = vmwarevif.get_vif_info(self._session, [ 806.098637] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 806.099245] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] for vif in network_info: [ 806.099245] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 806.099245] env[61439]: ERROR 
nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] return self._sync_wrapper(fn, *args, **kwargs) [ 806.099245] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 806.099245] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] self.wait() [ 806.099245] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 806.099245] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] self[:] = self._gt.wait() [ 806.099245] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 806.099245] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] return self._exit_event.wait() [ 806.099245] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 806.099245] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] result = hub.switch() [ 806.099245] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 806.099245] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] return self.greenlet.switch() [ 806.099767] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 806.099767] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] result 
= function(*args, **kwargs) [ 806.099767] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 806.099767] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] return func(*args, **kwargs) [ 806.099767] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 806.099767] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] raise e [ 806.099767] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 806.099767] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] nwinfo = self.network_api.allocate_for_instance( [ 806.099767] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 806.099767] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] created_port_ids = self._update_ports_for_instance( [ 806.099767] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 806.099767] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] with excutils.save_and_reraise_exception(): [ 806.099767] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 806.100314] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] self.force_reraise() [ 806.100314] env[61439]: 
ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 806.100314] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] raise self.value [ 806.100314] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 806.100314] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] updated_port = self._update_port( [ 806.100314] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 806.100314] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] _ensure_no_port_binding_failure(port) [ 806.100314] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 806.100314] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] raise exception.PortBindingFailed(port_id=port['id']) [ 806.100314] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] nova.exception.PortBindingFailed: Binding failed for port e9d024bf-bebd-413f-98c5-7f94c6c1dfef, please check neutron logs for more information. [ 806.100314] env[61439]: ERROR nova.compute.manager [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] [ 806.100585] env[61439]: DEBUG nova.compute.utils [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Binding failed for port e9d024bf-bebd-413f-98c5-7f94c6c1dfef, please check neutron logs for more information. 
{{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 806.100585] env[61439]: DEBUG nova.compute.manager [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Build of instance 93ade29f-b55a-4a85-85dd-fc05699b3a21 was re-scheduled: Binding failed for port e9d024bf-bebd-413f-98c5-7f94c6c1dfef, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 806.100585] env[61439]: DEBUG nova.compute.manager [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 806.100585] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Acquiring lock "refresh_cache-93ade29f-b55a-4a85-85dd-fc05699b3a21" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 806.100758] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Acquired lock "refresh_cache-93ade29f-b55a-4a85-85dd-fc05699b3a21" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 806.100758] env[61439]: DEBUG nova.network.neutron [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Building network info cache for instance 
{{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 806.331330] env[61439]: DEBUG nova.network.neutron [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 806.558025] env[61439]: DEBUG nova.network.neutron [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 806.570491] env[61439]: DEBUG oslo_concurrency.lockutils [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Releasing lock "refresh_cache-ac936bda-d410-4437-866c-9b3a9e04e169" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 806.570976] env[61439]: DEBUG nova.compute.manager [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 806.571090] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 806.571644] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-96e1517d-890c-4b3e-aada-56a533efcebc {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 806.581740] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbc52500-437b-409f-8940-2b1de56be99f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 806.592873] env[61439]: ERROR nova.compute.manager [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 5db53ed0-af24-4bcb-b026-cc45252750ed, please check neutron logs for more information. 
[ 806.592873] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 806.592873] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 806.592873] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 806.592873] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 806.592873] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 806.592873] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 806.592873] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 806.592873] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 806.592873] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 806.592873] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 806.592873] env[61439]: ERROR nova.compute.manager raise self.value [ 806.592873] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 806.592873] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 806.592873] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 806.592873] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 806.593504] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 806.593504] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 806.593504] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 5db53ed0-af24-4bcb-b026-cc45252750ed, please check neutron logs for more information. [ 806.593504] env[61439]: ERROR nova.compute.manager [ 806.593504] env[61439]: Traceback (most recent call last): [ 806.593504] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 806.593504] env[61439]: listener.cb(fileno) [ 806.593504] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 806.593504] env[61439]: result = function(*args, **kwargs) [ 806.593504] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 806.593504] env[61439]: return func(*args, **kwargs) [ 806.593504] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 806.593504] env[61439]: raise e [ 806.593504] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 806.593504] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 806.593504] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 806.593504] env[61439]: created_port_ids = self._update_ports_for_instance( [ 806.593504] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 806.593504] env[61439]: with excutils.save_and_reraise_exception(): [ 806.593504] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 806.593504] env[61439]: self.force_reraise() [ 806.593504] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 806.593504] env[61439]: raise self.value [ 806.593504] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 806.593504] env[61439]: 
updated_port = self._update_port( [ 806.593504] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 806.593504] env[61439]: _ensure_no_port_binding_failure(port) [ 806.593504] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 806.593504] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 806.595892] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 5db53ed0-af24-4bcb-b026-cc45252750ed, please check neutron logs for more information. [ 806.595892] env[61439]: Removing descriptor: 18 [ 806.595892] env[61439]: ERROR nova.compute.manager [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 5db53ed0-af24-4bcb-b026-cc45252750ed, please check neutron logs for more information. 
[ 806.595892] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Traceback (most recent call last): [ 806.595892] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 806.595892] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] yield resources [ 806.595892] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 806.595892] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] self.driver.spawn(context, instance, image_meta, [ 806.595892] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 806.595892] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 806.595892] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 806.595892] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] vm_ref = self.build_virtual_machine(instance, [ 806.596237] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 806.596237] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] vif_infos = vmwarevif.get_vif_info(self._session, [ 806.596237] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 806.596237] env[61439]: ERROR 
nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] for vif in network_info: [ 806.596237] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 806.596237] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] return self._sync_wrapper(fn, *args, **kwargs) [ 806.596237] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 806.596237] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] self.wait() [ 806.596237] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 806.596237] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] self[:] = self._gt.wait() [ 806.596237] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 806.596237] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] return self._exit_event.wait() [ 806.596237] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 806.596640] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] result = hub.switch() [ 806.596640] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 806.596640] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] return self.greenlet.switch() [ 806.596640] 
env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 806.596640] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] result = function(*args, **kwargs) [ 806.596640] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 806.596640] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] return func(*args, **kwargs) [ 806.596640] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 806.596640] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] raise e [ 806.596640] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 806.596640] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] nwinfo = self.network_api.allocate_for_instance( [ 806.596640] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 806.596640] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] created_port_ids = self._update_ports_for_instance( [ 806.596962] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 806.596962] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] with excutils.save_and_reraise_exception(): [ 806.596962] env[61439]: ERROR nova.compute.manager 
[instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 806.596962] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] self.force_reraise() [ 806.596962] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 806.596962] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] raise self.value [ 806.596962] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 806.596962] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] updated_port = self._update_port( [ 806.596962] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 806.596962] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] _ensure_no_port_binding_failure(port) [ 806.596962] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 806.596962] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] raise exception.PortBindingFailed(port_id=port['id']) [ 806.597973] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] nova.exception.PortBindingFailed: Binding failed for port 5db53ed0-af24-4bcb-b026-cc45252750ed, please check neutron logs for more information. 
[ 806.597973] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] [ 806.597973] env[61439]: INFO nova.compute.manager [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Terminating instance [ 806.597973] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "refresh_cache-8ab0cb27-1972-4fcb-a549-442e720d872c" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 806.597973] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquired lock "refresh_cache-8ab0cb27-1972-4fcb-a549-442e720d872c" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 806.597973] env[61439]: DEBUG nova.network.neutron [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 806.610513] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ac936bda-d410-4437-866c-9b3a9e04e169 could not be found. 
[ 806.611033] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 806.611033] env[61439]: INFO nova.compute.manager [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Took 0.04 seconds to destroy the instance on the hypervisor. [ 806.611191] env[61439]: DEBUG oslo.service.loopingcall [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 806.612856] env[61439]: DEBUG nova.compute.manager [-] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 806.612856] env[61439]: DEBUG nova.network.neutron [-] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 806.695787] env[61439]: DEBUG nova.network.neutron [-] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 806.697661] env[61439]: DEBUG nova.network.neutron [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 806.709018] env[61439]: DEBUG nova.network.neutron [-] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 806.719125] env[61439]: INFO nova.compute.manager [-] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Took 0.11 seconds to deallocate network for instance. [ 806.721281] env[61439]: DEBUG nova.compute.claims [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 806.721481] env[61439]: DEBUG oslo_concurrency.lockutils [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 806.721708] env[61439]: DEBUG oslo_concurrency.lockutils [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 806.755217] env[61439]: DEBUG nova.network.neutron [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 806.773015] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Releasing lock "refresh_cache-93ade29f-b55a-4a85-85dd-fc05699b3a21" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 806.774043] env[61439]: DEBUG nova.compute.manager [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 806.774043] env[61439]: DEBUG nova.compute.manager [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 806.774156] env[61439]: DEBUG nova.network.neutron [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 806.834880] env[61439]: DEBUG nova.network.neutron [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 806.845039] env[61439]: DEBUG nova.network.neutron [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 806.858350] env[61439]: INFO nova.compute.manager [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] [instance: 93ade29f-b55a-4a85-85dd-fc05699b3a21] Took 0.08 seconds to deallocate network for instance. 
[ 806.972896] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ba3b29d-b303-4480-84ce-14146fbb2ab7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 806.984827] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7277d46-c521-4347-a201-fceaa27758b4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 807.029153] env[61439]: INFO nova.scheduler.client.report [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Deleted allocations for instance 93ade29f-b55a-4a85-85dd-fc05699b3a21 [ 807.035111] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ef57cc8-ac39-4be7-9766-7f4dd927d3df {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 807.043661] env[61439]: DEBUG nova.network.neutron [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 807.045636] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e21a3faa-5378-4e0b-84f9-fea9a4aa1f89 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 807.071461] env[61439]: DEBUG nova.compute.provider_tree [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Inventory has not changed in ProviderTree 
for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 807.071461] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Releasing lock "refresh_cache-8ab0cb27-1972-4fcb-a549-442e720d872c" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 807.071461] env[61439]: DEBUG nova.compute.manager [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 807.071461] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 807.071461] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c82e2b41-65e0-454c-8a6f-e6f8d8990bbc {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 807.080834] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8c1c675-072a-4af7-8aa3-cb6b41d5f29f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 807.092817] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d42a19c-6e52-47d3-aed2-87814b4d4804 tempest-ServerGroupTestJSON-1745706281 tempest-ServerGroupTestJSON-1745706281-project-member] Lock "93ade29f-b55a-4a85-85dd-fc05699b3a21" 
"released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.428s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 807.093751] env[61439]: DEBUG nova.scheduler.client.report [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 807.117053] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8ab0cb27-1972-4fcb-a549-442e720d872c could not be found. 
[ 807.117053] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 807.117053] env[61439]: INFO nova.compute.manager [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Took 0.04 seconds to destroy the instance on the hypervisor. [ 807.117053] env[61439]: DEBUG oslo.service.loopingcall [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 807.118165] env[61439]: DEBUG nova.compute.manager [-] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 807.118165] env[61439]: DEBUG nova.network.neutron [-] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 807.120619] env[61439]: DEBUG oslo_concurrency.lockutils [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.399s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 807.121292] env[61439]: ERROR nova.compute.manager 
[None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port f18a0e9f-1571-438a-8dba-e8294e47767b, please check neutron logs for more information. [ 807.121292] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Traceback (most recent call last): [ 807.121292] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 807.121292] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] self.driver.spawn(context, instance, image_meta, [ 807.121292] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 807.121292] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] self._vmops.spawn(context, instance, image_meta, injected_files, [ 807.121292] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 807.121292] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] vm_ref = self.build_virtual_machine(instance, [ 807.121292] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 807.121292] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] vif_infos = vmwarevif.get_vif_info(self._session, [ 807.121292] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File 
"/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 807.121815] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] for vif in network_info: [ 807.121815] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 807.121815] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] return self._sync_wrapper(fn, *args, **kwargs) [ 807.121815] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 807.121815] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] self.wait() [ 807.121815] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 807.121815] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] self[:] = self._gt.wait() [ 807.121815] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 807.121815] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] return self._exit_event.wait() [ 807.121815] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 807.121815] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] result = hub.switch() [ 807.121815] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 807.121815] env[61439]: ERROR 
nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] return self.greenlet.switch() [ 807.122376] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 807.122376] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] result = function(*args, **kwargs) [ 807.122376] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 807.122376] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] return func(*args, **kwargs) [ 807.122376] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 807.122376] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] raise e [ 807.122376] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 807.122376] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] nwinfo = self.network_api.allocate_for_instance( [ 807.122376] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 807.122376] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] created_port_ids = self._update_ports_for_instance( [ 807.122376] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 807.122376] env[61439]: ERROR nova.compute.manager [instance: 
ac936bda-d410-4437-866c-9b3a9e04e169] with excutils.save_and_reraise_exception(): [ 807.122376] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 807.126748] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] self.force_reraise() [ 807.126748] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 807.126748] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] raise self.value [ 807.126748] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 807.126748] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] updated_port = self._update_port( [ 807.126748] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 807.126748] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] _ensure_no_port_binding_failure(port) [ 807.126748] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 807.126748] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] raise exception.PortBindingFailed(port_id=port['id']) [ 807.126748] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] nova.exception.PortBindingFailed: Binding failed for port f18a0e9f-1571-438a-8dba-e8294e47767b, please check neutron logs for more information. 
[ 807.126748] env[61439]: ERROR nova.compute.manager [instance: ac936bda-d410-4437-866c-9b3a9e04e169] [ 807.127316] env[61439]: DEBUG nova.compute.utils [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Binding failed for port f18a0e9f-1571-438a-8dba-e8294e47767b, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 807.127316] env[61439]: DEBUG nova.compute.manager [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Build of instance ac936bda-d410-4437-866c-9b3a9e04e169 was re-scheduled: Binding failed for port f18a0e9f-1571-438a-8dba-e8294e47767b, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 807.127316] env[61439]: DEBUG nova.compute.manager [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 807.127316] env[61439]: DEBUG oslo_concurrency.lockutils [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Acquiring lock "refresh_cache-ac936bda-d410-4437-866c-9b3a9e04e169" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 807.127453] env[61439]: DEBUG oslo_concurrency.lockutils [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 
tempest-ServerAddressesTestJSON-1402003061-project-member] Acquired lock "refresh_cache-ac936bda-d410-4437-866c-9b3a9e04e169" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 807.127453] env[61439]: DEBUG nova.network.neutron [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 807.172105] env[61439]: DEBUG nova.network.neutron [-] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 807.189806] env[61439]: DEBUG nova.network.neutron [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 807.192618] env[61439]: DEBUG nova.network.neutron [-] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 807.208022] env[61439]: INFO nova.compute.manager [-] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Took 0.09 seconds to deallocate network for instance. 
[ 807.210487] env[61439]: DEBUG nova.compute.claims [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 807.210701] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 807.210950] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 807.313849] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Acquiring lock "c30b7561-2c76-4d68-93da-f73e7c6a0ed7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 807.313849] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Lock "c30b7561-2c76-4d68-93da-f73e7c6a0ed7" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 807.327390] env[61439]: DEBUG nova.compute.manager [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 807.400685] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 807.426123] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35f691fb-a6a5-4f8d-afcf-2d6125771648 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 807.435605] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d55be7c9-3ed4-45f3-a55c-7c85d0326f98 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 807.440136] env[61439]: DEBUG nova.network.neutron [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 807.470905] env[61439]: DEBUG 
oslo_concurrency.lockutils [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Releasing lock "refresh_cache-ac936bda-d410-4437-866c-9b3a9e04e169" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 807.471159] env[61439]: DEBUG nova.compute.manager [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 807.471347] env[61439]: DEBUG nova.compute.manager [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 807.471542] env[61439]: DEBUG nova.network.neutron [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 807.474518] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f8c7b01-7ea9-4675-85ea-4cd5f0ffe0bb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 807.482616] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73ac4056-b2d9-4204-930a-fc1aaa97bfa4 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 807.498012] env[61439]: DEBUG nova.compute.provider_tree [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 807.500229] env[61439]: DEBUG nova.network.neutron [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 807.509704] env[61439]: DEBUG nova.scheduler.client.report [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 807.512405] env[61439]: DEBUG nova.network.neutron [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 807.524219] env[61439]: 
INFO nova.compute.manager [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] [instance: ac936bda-d410-4437-866c-9b3a9e04e169] Took 0.05 seconds to deallocate network for instance. [ 807.527327] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.316s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 807.527961] env[61439]: ERROR nova.compute.manager [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 5db53ed0-af24-4bcb-b026-cc45252750ed, please check neutron logs for more information. 
[ 807.527961] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Traceback (most recent call last): [ 807.527961] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 807.527961] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] self.driver.spawn(context, instance, image_meta, [ 807.527961] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 807.527961] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 807.527961] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 807.527961] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] vm_ref = self.build_virtual_machine(instance, [ 807.527961] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 807.527961] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] vif_infos = vmwarevif.get_vif_info(self._session, [ 807.527961] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 807.528370] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] for vif in network_info: [ 807.528370] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 807.528370] env[61439]: ERROR 
nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] return self._sync_wrapper(fn, *args, **kwargs) [ 807.528370] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 807.528370] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] self.wait() [ 807.528370] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 807.528370] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] self[:] = self._gt.wait() [ 807.528370] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 807.528370] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] return self._exit_event.wait() [ 807.528370] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 807.528370] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] result = hub.switch() [ 807.528370] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 807.528370] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] return self.greenlet.switch() [ 807.529564] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 807.529564] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] result 
= function(*args, **kwargs) [ 807.529564] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 807.529564] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] return func(*args, **kwargs) [ 807.529564] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 807.529564] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] raise e [ 807.529564] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 807.529564] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] nwinfo = self.network_api.allocate_for_instance( [ 807.529564] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 807.529564] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] created_port_ids = self._update_ports_for_instance( [ 807.529564] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 807.529564] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] with excutils.save_and_reraise_exception(): [ 807.529564] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 807.530706] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] self.force_reraise() [ 807.530706] env[61439]: 
ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 807.530706] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] raise self.value [ 807.530706] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 807.530706] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] updated_port = self._update_port( [ 807.530706] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 807.530706] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] _ensure_no_port_binding_failure(port) [ 807.530706] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 807.530706] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] raise exception.PortBindingFailed(port_id=port['id']) [ 807.530706] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] nova.exception.PortBindingFailed: Binding failed for port 5db53ed0-af24-4bcb-b026-cc45252750ed, please check neutron logs for more information. [ 807.530706] env[61439]: ERROR nova.compute.manager [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] [ 807.531637] env[61439]: DEBUG nova.compute.utils [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Binding failed for port 5db53ed0-af24-4bcb-b026-cc45252750ed, please check neutron logs for more information. 
{{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 807.531637] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.129s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 807.531637] env[61439]: INFO nova.compute.claims [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 807.534754] env[61439]: DEBUG nova.compute.manager [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Build of instance 8ab0cb27-1972-4fcb-a549-442e720d872c was re-scheduled: Binding failed for port 5db53ed0-af24-4bcb-b026-cc45252750ed, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 807.535608] env[61439]: DEBUG nova.compute.manager [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 807.535608] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "refresh_cache-8ab0cb27-1972-4fcb-a549-442e720d872c" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 807.535759] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquired lock "refresh_cache-8ab0cb27-1972-4fcb-a549-442e720d872c" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 807.536016] env[61439]: DEBUG nova.network.neutron [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 807.567755] env[61439]: DEBUG nova.network.neutron [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 807.686022] env[61439]: INFO nova.scheduler.client.report [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Deleted allocations for instance ac936bda-d410-4437-866c-9b3a9e04e169 [ 807.692884] env[61439]: ERROR nova.compute.manager [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port ace1f76f-cb81-4243-85fd-09aa9ab57d92, please check neutron logs for more information. [ 807.692884] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 807.692884] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 807.692884] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 807.692884] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 807.692884] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 807.692884] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 807.692884] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 807.692884] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 807.692884] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 807.692884] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 807.692884] env[61439]: 
ERROR nova.compute.manager raise self.value [ 807.692884] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 807.692884] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 807.692884] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 807.692884] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 807.693482] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 807.693482] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 807.693482] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port ace1f76f-cb81-4243-85fd-09aa9ab57d92, please check neutron logs for more information. [ 807.693482] env[61439]: ERROR nova.compute.manager [ 807.693482] env[61439]: Traceback (most recent call last): [ 807.693482] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 807.693482] env[61439]: listener.cb(fileno) [ 807.693482] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 807.693482] env[61439]: result = function(*args, **kwargs) [ 807.693482] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 807.693482] env[61439]: return func(*args, **kwargs) [ 807.693482] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 807.693482] env[61439]: raise e [ 807.693482] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 807.693482] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 807.693482] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in 
allocate_for_instance [ 807.693482] env[61439]: created_port_ids = self._update_ports_for_instance( [ 807.693482] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 807.693482] env[61439]: with excutils.save_and_reraise_exception(): [ 807.693482] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 807.693482] env[61439]: self.force_reraise() [ 807.693482] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 807.693482] env[61439]: raise self.value [ 807.693482] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 807.693482] env[61439]: updated_port = self._update_port( [ 807.693482] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 807.693482] env[61439]: _ensure_no_port_binding_failure(port) [ 807.693482] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 807.693482] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 807.694533] env[61439]: nova.exception.PortBindingFailed: Binding failed for port ace1f76f-cb81-4243-85fd-09aa9ab57d92, please check neutron logs for more information. [ 807.694533] env[61439]: Removing descriptor: 20 [ 807.694533] env[61439]: ERROR nova.compute.manager [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port ace1f76f-cb81-4243-85fd-09aa9ab57d92, please check neutron logs for more information. 
[ 807.694533] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Traceback (most recent call last): [ 807.694533] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 807.694533] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] yield resources [ 807.694533] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 807.694533] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] self.driver.spawn(context, instance, image_meta, [ 807.694533] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 807.694533] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] self._vmops.spawn(context, instance, image_meta, injected_files, [ 807.694533] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 807.694533] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] vm_ref = self.build_virtual_machine(instance, [ 807.696674] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 807.696674] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] vif_infos = vmwarevif.get_vif_info(self._session, [ 807.696674] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 807.696674] env[61439]: ERROR 
nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] for vif in network_info: [ 807.696674] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 807.696674] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] return self._sync_wrapper(fn, *args, **kwargs) [ 807.696674] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 807.696674] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] self.wait() [ 807.696674] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 807.696674] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] self[:] = self._gt.wait() [ 807.696674] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 807.696674] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] return self._exit_event.wait() [ 807.696674] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 807.697579] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] result = hub.switch() [ 807.697579] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 807.697579] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] return self.greenlet.switch() [ 807.697579] 
env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 807.697579] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] result = function(*args, **kwargs) [ 807.697579] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 807.697579] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] return func(*args, **kwargs) [ 807.697579] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 807.697579] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] raise e [ 807.697579] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 807.697579] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] nwinfo = self.network_api.allocate_for_instance( [ 807.697579] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 807.697579] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] created_port_ids = self._update_ports_for_instance( [ 807.699138] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 807.699138] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] with excutils.save_and_reraise_exception(): [ 807.699138] env[61439]: ERROR nova.compute.manager 
[instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 807.699138] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] self.force_reraise() [ 807.699138] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 807.699138] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] raise self.value [ 807.699138] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 807.699138] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] updated_port = self._update_port( [ 807.699138] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 807.699138] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] _ensure_no_port_binding_failure(port) [ 807.699138] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 807.699138] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] raise exception.PortBindingFailed(port_id=port['id']) [ 807.699468] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] nova.exception.PortBindingFailed: Binding failed for port ace1f76f-cb81-4243-85fd-09aa9ab57d92, please check neutron logs for more information. 
[ 807.699468] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] [ 807.699468] env[61439]: INFO nova.compute.manager [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Terminating instance [ 807.699468] env[61439]: DEBUG oslo_concurrency.lockutils [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Acquiring lock "refresh_cache-2e7a252e-071f-48bb-91ae-f4bdd7907059" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 807.699468] env[61439]: DEBUG oslo_concurrency.lockutils [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Acquired lock "refresh_cache-2e7a252e-071f-48bb-91ae-f4bdd7907059" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 807.699468] env[61439]: DEBUG nova.network.neutron [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 807.727919] env[61439]: DEBUG oslo_concurrency.lockutils [None req-909192ed-311c-4224-b124-0feb2c19c2da tempest-ServerAddressesTestJSON-1402003061 tempest-ServerAddressesTestJSON-1402003061-project-member] Lock "ac936bda-d410-4437-866c-9b3a9e04e169" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.616s {{(pid=61439) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 807.777046] env[61439]: DEBUG nova.network.neutron [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 807.798962] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71ebc02f-ff1c-401b-b96b-939c26ab43b1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 807.809508] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ae187e9-9fc3-4937-a0f6-f36a4f8d18b0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 807.848825] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49a1f5f5-4b3d-4951-9cd8-baa21d80cf06 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 807.857567] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3d28238-5714-4218-8230-d2013949ec96 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 807.882290] env[61439]: DEBUG nova.compute.provider_tree [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 807.893879] env[61439]: DEBUG 
nova.scheduler.client.report [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 807.897786] env[61439]: DEBUG nova.network.neutron [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 807.913698] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Releasing lock "refresh_cache-8ab0cb27-1972-4fcb-a549-442e720d872c" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 807.914651] env[61439]: DEBUG nova.compute.manager [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 807.914651] env[61439]: DEBUG nova.compute.manager [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 807.914651] env[61439]: DEBUG nova.network.neutron [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 807.917861] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.388s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 807.921238] env[61439]: DEBUG nova.compute.manager [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Start building networks asynchronously for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 807.972980] env[61439]: DEBUG nova.compute.utils [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 807.976974] env[61439]: DEBUG nova.compute.manager [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 807.976974] env[61439]: DEBUG nova.network.neutron [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 807.990067] env[61439]: DEBUG nova.compute.manager [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 808.095039] env[61439]: DEBUG nova.compute.manager [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 808.121429] env[61439]: DEBUG nova.virt.hardware [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 808.121670] env[61439]: DEBUG nova.virt.hardware [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 808.121831] env[61439]: DEBUG nova.virt.hardware [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 808.122055] env[61439]: DEBUG nova.virt.hardware [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 
tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 808.127167] env[61439]: DEBUG nova.virt.hardware [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 808.127167] env[61439]: DEBUG nova.virt.hardware [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 808.127167] env[61439]: DEBUG nova.virt.hardware [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 808.127167] env[61439]: DEBUG nova.virt.hardware [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 808.127167] env[61439]: DEBUG nova.virt.hardware [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 808.127555] 
env[61439]: DEBUG nova.virt.hardware [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 808.127555] env[61439]: DEBUG nova.virt.hardware [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 808.127555] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f627dd5-66c6-4e0e-ba1c-f5ce52d327f2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 808.136979] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c786602a-dff6-4547-b123-c75973dc4f97 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 808.170520] env[61439]: DEBUG oslo_concurrency.lockutils [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Acquiring lock "025992d2-423e-492d-b820-7d9f3554c0f6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 808.170520] env[61439]: DEBUG oslo_concurrency.lockutils [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Lock 
"025992d2-423e-492d-b820-7d9f3554c0f6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 808.182728] env[61439]: DEBUG nova.compute.manager [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 808.197421] env[61439]: DEBUG nova.network.neutron [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 808.201232] env[61439]: DEBUG nova.network.neutron [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 808.206785] env[61439]: DEBUG nova.network.neutron [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 808.217418] env[61439]: DEBUG oslo_concurrency.lockutils [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 
tempest-InstanceActionsV221TestJSON-346798859-project-member] Releasing lock "refresh_cache-2e7a252e-071f-48bb-91ae-f4bdd7907059" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 808.217825] env[61439]: DEBUG nova.compute.manager [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 808.219026] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 808.219026] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f3d5e522-d769-46e8-b625-d30f207bb61a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 808.221847] env[61439]: INFO nova.compute.manager [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8ab0cb27-1972-4fcb-a549-442e720d872c] Took 0.31 seconds to deallocate network for instance. 
[ 808.232143] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12dd43d5-0497-45c1-84ad-a186dc15ff89 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 808.262901] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 2e7a252e-071f-48bb-91ae-f4bdd7907059 could not be found. [ 808.262901] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 808.262901] env[61439]: INFO nova.compute.manager [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Took 0.04 seconds to destroy the instance on the hypervisor. [ 808.262901] env[61439]: DEBUG oslo.service.loopingcall [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 808.262901] env[61439]: DEBUG nova.compute.manager [-] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 808.263259] env[61439]: DEBUG nova.network.neutron [-] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 808.276328] env[61439]: DEBUG nova.policy [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bea60c029be640338b24207747cf2c13', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2011298d12d046aebc57dc28f99c9cc6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 808.278057] env[61439]: DEBUG oslo_concurrency.lockutils [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 808.278412] env[61439]: DEBUG oslo_concurrency.lockutils [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 808.279922] env[61439]: INFO nova.compute.claims [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 808.303165] env[61439]: DEBUG nova.network.neutron [-] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 808.317368] env[61439]: DEBUG nova.network.neutron [-] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 808.347328] env[61439]: INFO nova.compute.manager [-] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Took 0.09 seconds to deallocate network for instance. 
[ 808.349882] env[61439]: DEBUG nova.compute.claims [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 808.350743] env[61439]: DEBUG oslo_concurrency.lockutils [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 808.383850] env[61439]: INFO nova.scheduler.client.report [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Deleted allocations for instance 8ab0cb27-1972-4fcb-a549-442e720d872c [ 808.421028] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f76b41af-70d3-47af-9aad-0218502ebc35 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "8ab0cb27-1972-4fcb-a549-442e720d872c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.742s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 808.528371] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88191105-de0a-40ac-b485-eef985f512d0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 808.546555] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bf5f875-b57c-4a77-b373-947ded8bf80b {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 808.581037] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-257e9467-7b52-44a3-beb3-e6afc2fb081b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 808.589480] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a30cc429-7dc9-4a80-ac0e-10cc5ff6820a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 808.605637] env[61439]: DEBUG nova.compute.provider_tree [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 808.620233] env[61439]: DEBUG nova.scheduler.client.report [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 808.640992] env[61439]: DEBUG oslo_concurrency.lockutils [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 
tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.362s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 808.641716] env[61439]: DEBUG nova.compute.manager [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 808.644642] env[61439]: DEBUG oslo_concurrency.lockutils [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.294s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 808.693440] env[61439]: DEBUG nova.compute.utils [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 808.697049] env[61439]: DEBUG nova.compute.manager [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 808.697480] env[61439]: DEBUG nova.network.neutron [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 808.707750] env[61439]: DEBUG nova.compute.manager [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 808.813064] env[61439]: DEBUG nova.compute.manager [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 808.821695] env[61439]: ERROR nova.compute.manager [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 552cc4f1-2181-4a39-8014-88c9feb747d9, please check neutron logs for more information. 
[ 808.821695] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 808.821695] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 808.821695] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 808.821695] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 808.821695] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 808.821695] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 808.821695] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 808.821695] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 808.821695] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 808.821695] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 808.821695] env[61439]: ERROR nova.compute.manager raise self.value [ 808.821695] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 808.821695] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 808.821695] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 808.821695] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 808.822339] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 808.822339] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 808.822339] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 552cc4f1-2181-4a39-8014-88c9feb747d9, please check neutron logs for more information. [ 808.822339] env[61439]: ERROR nova.compute.manager [ 808.822339] env[61439]: Traceback (most recent call last): [ 808.822339] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 808.822339] env[61439]: listener.cb(fileno) [ 808.822339] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 808.822339] env[61439]: result = function(*args, **kwargs) [ 808.822339] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 808.822339] env[61439]: return func(*args, **kwargs) [ 808.822339] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 808.822339] env[61439]: raise e [ 808.822339] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 808.822339] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 808.822339] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 808.822339] env[61439]: created_port_ids = self._update_ports_for_instance( [ 808.822339] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 808.822339] env[61439]: with excutils.save_and_reraise_exception(): [ 808.822339] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 808.822339] env[61439]: self.force_reraise() [ 808.822339] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 808.822339] env[61439]: raise self.value [ 808.822339] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 808.822339] env[61439]: 
updated_port = self._update_port( [ 808.822339] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 808.822339] env[61439]: _ensure_no_port_binding_failure(port) [ 808.822339] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 808.822339] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 808.823289] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 552cc4f1-2181-4a39-8014-88c9feb747d9, please check neutron logs for more information. [ 808.823289] env[61439]: Removing descriptor: 10 [ 808.823289] env[61439]: ERROR nova.compute.manager [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 552cc4f1-2181-4a39-8014-88c9feb747d9, please check neutron logs for more information. 
[ 808.823289] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Traceback (most recent call last): [ 808.823289] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 808.823289] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] yield resources [ 808.823289] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 808.823289] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] self.driver.spawn(context, instance, image_meta, [ 808.823289] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 808.823289] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 808.823289] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 808.823289] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] vm_ref = self.build_virtual_machine(instance, [ 808.823897] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 808.823897] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] vif_infos = vmwarevif.get_vif_info(self._session, [ 808.823897] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 808.823897] env[61439]: ERROR 
nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] for vif in network_info: [ 808.823897] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 808.823897] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] return self._sync_wrapper(fn, *args, **kwargs) [ 808.823897] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 808.823897] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] self.wait() [ 808.823897] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 808.823897] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] self[:] = self._gt.wait() [ 808.823897] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 808.823897] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] return self._exit_event.wait() [ 808.823897] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 808.824431] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] result = hub.switch() [ 808.824431] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 808.824431] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] return self.greenlet.switch() [ 808.824431] 
env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 808.824431] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] result = function(*args, **kwargs) [ 808.824431] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 808.824431] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] return func(*args, **kwargs) [ 808.824431] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 808.824431] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] raise e [ 808.824431] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 808.824431] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] nwinfo = self.network_api.allocate_for_instance( [ 808.824431] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 808.824431] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] created_port_ids = self._update_ports_for_instance( [ 808.824824] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 808.824824] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] with excutils.save_and_reraise_exception(): [ 808.824824] env[61439]: ERROR nova.compute.manager 
[instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 808.824824] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] self.force_reraise() [ 808.824824] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 808.824824] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] raise self.value [ 808.824824] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 808.824824] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] updated_port = self._update_port( [ 808.824824] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 808.824824] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] _ensure_no_port_binding_failure(port) [ 808.824824] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 808.824824] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] raise exception.PortBindingFailed(port_id=port['id']) [ 808.825197] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] nova.exception.PortBindingFailed: Binding failed for port 552cc4f1-2181-4a39-8014-88c9feb747d9, please check neutron logs for more information. 
[ 808.825197] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] [ 808.825197] env[61439]: INFO nova.compute.manager [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Terminating instance [ 808.831913] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquiring lock "refresh_cache-974c8b62-f1ce-491a-a646-b091d3af2bb3" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 808.834596] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquired lock "refresh_cache-974c8b62-f1ce-491a-a646-b091d3af2bb3" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 808.835374] env[61439]: DEBUG nova.network.neutron [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 808.846758] env[61439]: DEBUG nova.policy [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '58092638c8144ee383fc39898d1291af', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '23a10ccee87e4d6a9959b310e8a58d86', 
'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 808.879216] env[61439]: DEBUG nova.virt.hardware [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 808.879768] env[61439]: DEBUG nova.virt.hardware [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 808.879947] env[61439]: DEBUG nova.virt.hardware [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:352}} [ 808.880164] env[61439]: DEBUG nova.virt.hardware [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 808.880313] env[61439]: DEBUG nova.virt.hardware [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 808.880513] env[61439]: DEBUG nova.virt.hardware [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 808.880756] env[61439]: DEBUG nova.virt.hardware [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 808.880959] env[61439]: DEBUG nova.virt.hardware [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 808.881185] env[61439]: DEBUG nova.virt.hardware [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 
tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 808.881503] env[61439]: DEBUG nova.virt.hardware [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 808.881788] env[61439]: DEBUG nova.virt.hardware [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 808.882991] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9626cac-17a0-4ed4-8dec-75d6f590d633 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 808.894313] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68a3532b-f4b3-4dae-b0a0-9b509a1dd139 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 808.911291] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-864bad63-9e19-4707-91bf-e1ef6e31df50 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 808.920877] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8d7d4ee-34ff-4170-809f-abbef60bed59 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 808.955965] env[61439]: DEBUG nova.network.neutron [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 808.958523] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a0bfc51-a5b0-4e60-b04e-addcc1a805c5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 808.968669] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b064745-30a5-4247-8f4a-a7d274798b7a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 808.987881] env[61439]: DEBUG nova.compute.provider_tree [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 808.998170] env[61439]: DEBUG nova.scheduler.client.report [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 
'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 809.021241] env[61439]: DEBUG oslo_concurrency.lockutils [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.376s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 809.021957] env[61439]: ERROR nova.compute.manager [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port ace1f76f-cb81-4243-85fd-09aa9ab57d92, please check neutron logs for more information. 
[ 809.021957] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Traceback (most recent call last): [ 809.021957] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 809.021957] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] self.driver.spawn(context, instance, image_meta, [ 809.021957] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 809.021957] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] self._vmops.spawn(context, instance, image_meta, injected_files, [ 809.021957] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 809.021957] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] vm_ref = self.build_virtual_machine(instance, [ 809.021957] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 809.021957] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] vif_infos = vmwarevif.get_vif_info(self._session, [ 809.021957] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 809.022312] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] for vif in network_info: [ 809.022312] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 809.022312] env[61439]: ERROR 
nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] return self._sync_wrapper(fn, *args, **kwargs) [ 809.022312] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 809.022312] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] self.wait() [ 809.022312] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 809.022312] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] self[:] = self._gt.wait() [ 809.022312] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 809.022312] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] return self._exit_event.wait() [ 809.022312] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 809.022312] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] result = hub.switch() [ 809.022312] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 809.022312] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] return self.greenlet.switch() [ 809.022674] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 809.022674] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] result 
= function(*args, **kwargs) [ 809.022674] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 809.022674] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] return func(*args, **kwargs) [ 809.022674] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 809.022674] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] raise e [ 809.022674] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 809.022674] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] nwinfo = self.network_api.allocate_for_instance( [ 809.022674] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 809.022674] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] created_port_ids = self._update_ports_for_instance( [ 809.022674] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 809.022674] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] with excutils.save_and_reraise_exception(): [ 809.022674] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 809.023710] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] self.force_reraise() [ 809.023710] env[61439]: 
ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 809.023710] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] raise self.value [ 809.023710] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 809.023710] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] updated_port = self._update_port( [ 809.023710] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 809.023710] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] _ensure_no_port_binding_failure(port) [ 809.023710] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 809.023710] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] raise exception.PortBindingFailed(port_id=port['id']) [ 809.023710] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] nova.exception.PortBindingFailed: Binding failed for port ace1f76f-cb81-4243-85fd-09aa9ab57d92, please check neutron logs for more information. 
[ 809.023710] env[61439]: ERROR nova.compute.manager [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] [ 809.024019] env[61439]: DEBUG nova.compute.utils [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Binding failed for port ace1f76f-cb81-4243-85fd-09aa9ab57d92, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 809.025024] env[61439]: DEBUG nova.compute.manager [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Build of instance 2e7a252e-071f-48bb-91ae-f4bdd7907059 was re-scheduled: Binding failed for port ace1f76f-cb81-4243-85fd-09aa9ab57d92, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 809.025283] env[61439]: DEBUG nova.compute.manager [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 809.025511] env[61439]: DEBUG oslo_concurrency.lockutils [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Acquiring lock "refresh_cache-2e7a252e-071f-48bb-91ae-f4bdd7907059" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 809.025656] env[61439]: DEBUG oslo_concurrency.lockutils [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 
tempest-InstanceActionsV221TestJSON-346798859-project-member] Acquired lock "refresh_cache-2e7a252e-071f-48bb-91ae-f4bdd7907059" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 809.025812] env[61439]: DEBUG nova.network.neutron [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 809.072135] env[61439]: DEBUG nova.network.neutron [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Successfully created port: 81aad203-f976-4c1c-81d7-6848338446d7 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 809.120817] env[61439]: DEBUG nova.network.neutron [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 809.416186] env[61439]: DEBUG nova.network.neutron [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 809.433546] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Releasing lock "refresh_cache-974c8b62-f1ce-491a-a646-b091d3af2bb3" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 809.434065] env[61439]: DEBUG nova.compute.manager [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 809.434301] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 809.434924] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b51cd3b0-ef2b-4bbe-a11e-c0221a3f30e4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 809.449844] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77d935d9-287d-4c67-b488-be658a51b7da {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 809.480820] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 974c8b62-f1ce-491a-a646-b091d3af2bb3 could not be found. 
[ 809.481061] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 809.481245] env[61439]: INFO nova.compute.manager [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Took 0.05 seconds to destroy the instance on the hypervisor. [ 809.481504] env[61439]: DEBUG oslo.service.loopingcall [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 809.483349] env[61439]: DEBUG nova.compute.manager [-] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 809.483450] env[61439]: DEBUG nova.network.neutron [-] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 809.488792] env[61439]: DEBUG oslo_concurrency.lockutils [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Acquiring lock "de136287-7dfb-4829-aef7-96ae82acfa65" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=61439) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 809.488792] env[61439]: DEBUG oslo_concurrency.lockutils [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Lock "de136287-7dfb-4829-aef7-96ae82acfa65" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 809.505488] env[61439]: DEBUG nova.compute.manager [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 809.550416] env[61439]: DEBUG nova.network.neutron [-] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 809.567021] env[61439]: DEBUG nova.network.neutron [-] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 809.575063] env[61439]: INFO nova.compute.manager [-] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Took 0.09 seconds to deallocate network for instance. 
[ 809.577094] env[61439]: DEBUG nova.compute.claims [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 809.577277] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 809.577495] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 809.588166] env[61439]: DEBUG oslo_concurrency.lockutils [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 809.760658] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29c21805-4d08-4633-88b1-2fa25ebb5dc2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 809.769025] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-05690344-f4c6-4503-bef7-003fe948cf52 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 809.802549] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79ebbdf5-d57a-4f15-9cc6-b2f54db1b281 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 809.810791] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39de98ef-3e37-4871-b9ea-58d565b2aace {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 809.829445] env[61439]: DEBUG nova.compute.provider_tree [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 809.841200] env[61439]: DEBUG nova.scheduler.client.report [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 809.860367] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 
tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.283s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 809.860996] env[61439]: ERROR nova.compute.manager [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 552cc4f1-2181-4a39-8014-88c9feb747d9, please check neutron logs for more information. [ 809.860996] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Traceback (most recent call last): [ 809.860996] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 809.860996] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] self.driver.spawn(context, instance, image_meta, [ 809.860996] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 809.860996] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 809.860996] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 809.860996] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] vm_ref = self.build_virtual_machine(instance, [ 809.860996] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 809.860996] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] vif_infos = vmwarevif.get_vif_info(self._session, [ 809.860996] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 809.861498] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] for vif in network_info: [ 809.861498] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 809.861498] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] return self._sync_wrapper(fn, *args, **kwargs) [ 809.861498] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 809.861498] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] self.wait() [ 809.861498] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 809.861498] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] self[:] = self._gt.wait() [ 809.861498] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 809.861498] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] return self._exit_event.wait() [ 809.861498] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 809.861498] env[61439]: 
ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] result = hub.switch() [ 809.861498] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 809.861498] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] return self.greenlet.switch() [ 809.862095] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 809.862095] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] result = function(*args, **kwargs) [ 809.862095] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 809.862095] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] return func(*args, **kwargs) [ 809.862095] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 809.862095] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] raise e [ 809.862095] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 809.862095] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] nwinfo = self.network_api.allocate_for_instance( [ 809.862095] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 809.862095] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] 
created_port_ids = self._update_ports_for_instance( [ 809.862095] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 809.862095] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] with excutils.save_and_reraise_exception(): [ 809.862095] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 809.862989] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] self.force_reraise() [ 809.862989] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 809.862989] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] raise self.value [ 809.862989] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 809.862989] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] updated_port = self._update_port( [ 809.862989] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 809.862989] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] _ensure_no_port_binding_failure(port) [ 809.862989] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 809.862989] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] raise 
exception.PortBindingFailed(port_id=port['id']) [ 809.862989] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] nova.exception.PortBindingFailed: Binding failed for port 552cc4f1-2181-4a39-8014-88c9feb747d9, please check neutron logs for more information. [ 809.862989] env[61439]: ERROR nova.compute.manager [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] [ 809.863307] env[61439]: DEBUG nova.compute.utils [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Binding failed for port 552cc4f1-2181-4a39-8014-88c9feb747d9, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 809.863307] env[61439]: DEBUG oslo_concurrency.lockutils [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.275s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 809.864364] env[61439]: INFO nova.compute.claims [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 809.870021] env[61439]: DEBUG nova.compute.manager [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Build of instance 974c8b62-f1ce-491a-a646-b091d3af2bb3 was re-scheduled: Binding failed for port 
552cc4f1-2181-4a39-8014-88c9feb747d9, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 809.870021] env[61439]: DEBUG nova.compute.manager [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 809.870021] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquiring lock "refresh_cache-974c8b62-f1ce-491a-a646-b091d3af2bb3" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 809.870021] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquired lock "refresh_cache-974c8b62-f1ce-491a-a646-b091d3af2bb3" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 809.870236] env[61439]: DEBUG nova.network.neutron [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 809.965590] env[61439]: DEBUG nova.network.neutron [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Updating instance_info_cache with network_info: [] 
{{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 809.982927] env[61439]: DEBUG oslo_concurrency.lockutils [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Releasing lock "refresh_cache-2e7a252e-071f-48bb-91ae-f4bdd7907059" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 809.983191] env[61439]: DEBUG nova.compute.manager [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 809.983377] env[61439]: DEBUG nova.compute.manager [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 809.983792] env[61439]: DEBUG nova.network.neutron [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 809.985889] env[61439]: DEBUG nova.network.neutron [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 810.041055] env[61439]: DEBUG nova.network.neutron [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 810.051956] env[61439]: DEBUG nova.network.neutron [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 810.058165] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-093a75bd-12af-4945-899b-7fc909091306 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 810.063105] env[61439]: INFO nova.compute.manager [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] [instance: 2e7a252e-071f-48bb-91ae-f4bdd7907059] Took 0.08 seconds to deallocate network for instance. 
[ 810.068817] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2867015-a744-4510-a35a-7f0c5a044471 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 810.109476] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06c93b9c-a6d6-4eef-949a-2ef2c118502f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 810.120093] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49c725fe-8b8a-4543-9420-bc5c429629e8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 810.135013] env[61439]: DEBUG nova.compute.provider_tree [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 810.143725] env[61439]: DEBUG nova.scheduler.client.report [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 810.159631] 
env[61439]: DEBUG oslo_concurrency.lockutils [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.294s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 810.159631] env[61439]: DEBUG nova.compute.manager [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 810.181236] env[61439]: INFO nova.scheduler.client.report [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Deleted allocations for instance 2e7a252e-071f-48bb-91ae-f4bdd7907059 [ 810.201945] env[61439]: DEBUG nova.compute.utils [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 810.203711] env[61439]: DEBUG nova.compute.manager [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 810.203711] env[61439]: DEBUG nova.network.neutron [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 810.206962] env[61439]: DEBUG oslo_concurrency.lockutils [None req-a5c8cd26-f0b6-4ca4-b85f-007ffbf842ce tempest-InstanceActionsV221TestJSON-346798859 tempest-InstanceActionsV221TestJSON-346798859-project-member] Lock "2e7a252e-071f-48bb-91ae-f4bdd7907059" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.465s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 810.230739] env[61439]: DEBUG nova.compute.manager [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 810.311324] env[61439]: DEBUG nova.compute.manager [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 810.338134] env[61439]: DEBUG nova.network.neutron [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 810.350244] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Releasing lock "refresh_cache-974c8b62-f1ce-491a-a646-b091d3af2bb3" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 810.350244] env[61439]: DEBUG nova.compute.manager [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 810.350244] env[61439]: DEBUG nova.compute.manager [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 810.350244] env[61439]: DEBUG nova.network.neutron [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 810.353983] env[61439]: DEBUG nova.virt.hardware [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 810.354235] env[61439]: DEBUG nova.virt.hardware [None 
req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 810.354418] env[61439]: DEBUG nova.virt.hardware [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 810.354670] env[61439]: DEBUG nova.virt.hardware [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 810.354826] env[61439]: DEBUG nova.virt.hardware [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 810.354977] env[61439]: DEBUG nova.virt.hardware [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 810.355223] env[61439]: DEBUG nova.virt.hardware [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:569}} [ 810.355365] env[61439]: DEBUG nova.virt.hardware [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 810.355835] env[61439]: DEBUG nova.virt.hardware [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 810.355835] env[61439]: DEBUG nova.virt.hardware [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 810.355991] env[61439]: DEBUG nova.virt.hardware [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 810.357009] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6502f489-59c4-4087-ac2b-4414e969a18f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 810.366139] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d6b0b9f-baea-4e43-9475-2248eeb8d9b3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 810.390923] 
env[61439]: DEBUG nova.network.neutron [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 810.399573] env[61439]: DEBUG nova.network.neutron [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 810.407543] env[61439]: INFO nova.compute.manager [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 974c8b62-f1ce-491a-a646-b091d3af2bb3] Took 0.06 seconds to deallocate network for instance. 
[ 810.471184] env[61439]: DEBUG nova.policy [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5e7f8b3201474ff28bf523b845f0bd87', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9a13458250fe40d78872c7d479b16b73', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 810.541723] env[61439]: INFO nova.scheduler.client.report [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Deleted allocations for instance 974c8b62-f1ce-491a-a646-b091d3af2bb3 [ 810.582297] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ff715d4f-057b-4e47-b11f-77766dc24abc tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "974c8b62-f1ce-491a-a646-b091d3af2bb3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.908s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 810.856225] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Acquiring lock "282ecbf4-cd05-4ea0-bb3f-708969856b7e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
810.856491] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Lock "282ecbf4-cd05-4ea0-bb3f-708969856b7e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 810.872083] env[61439]: DEBUG nova.compute.manager [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 810.942039] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 810.942459] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 810.946117] env[61439]: INFO nova.compute.claims [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Claim 
successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 810.968955] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Acquiring lock "eec2b74f-9b6c-4566-a0dc-da1fe9578715" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 810.970139] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Lock "eec2b74f-9b6c-4566-a0dc-da1fe9578715" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 810.983932] env[61439]: DEBUG nova.network.neutron [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Successfully created port: 5a95018b-84db-45e5-8b22-d05ea50159a5 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 810.988011] env[61439]: DEBUG nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 811.010818] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Acquiring lock "cfda19ad-e831-4018-9228-e96fede0bae6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 811.011082] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Lock "cfda19ad-e831-4018-9228-e96fede0bae6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 811.032088] env[61439]: DEBUG nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 811.071105] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 811.094851] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 811.262374] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-159a1aa6-eef8-4d27-ba2e-af2adeade797 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 811.271206] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd11da7e-3107-486d-b723-cb64bd41a7d5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 811.306635] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b2979f2-d90c-4029-9405-22419092b09e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 811.314713] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-434ebb98-0639-4d5d-a725-2e85e445eebf {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
811.330470] env[61439]: DEBUG nova.compute.provider_tree [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 811.340010] env[61439]: DEBUG nova.scheduler.client.report [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 811.359594] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.417s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 811.360564] env[61439]: DEBUG nova.compute.manager [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Start building networks asynchronously for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 811.363476] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.293s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 811.364899] env[61439]: INFO nova.compute.claims [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 811.401981] env[61439]: DEBUG nova.compute.utils [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 811.403580] env[61439]: DEBUG nova.compute.manager [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 811.403870] env[61439]: DEBUG nova.network.neutron [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 811.413275] env[61439]: DEBUG nova.compute.manager [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 811.495752] env[61439]: DEBUG nova.compute.manager [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 811.511105] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Acquiring lock "0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 811.511105] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Lock "0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 811.534376] env[61439]: DEBUG nova.virt.hardware [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 811.534376] env[61439]: DEBUG nova.virt.hardware [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 811.534376] env[61439]: DEBUG nova.virt.hardware [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 811.534818] env[61439]: DEBUG nova.virt.hardware [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 811.534818] env[61439]: DEBUG nova.virt.hardware [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 811.534818] env[61439]: DEBUG nova.virt.hardware [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 
tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 811.534818] env[61439]: DEBUG nova.virt.hardware [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 811.534818] env[61439]: DEBUG nova.virt.hardware [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 811.535041] env[61439]: DEBUG nova.virt.hardware [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 811.535041] env[61439]: DEBUG nova.virt.hardware [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 811.535041] env[61439]: DEBUG nova.virt.hardware [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) 
_get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 811.536070] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3cf3748-27db-4417-8e36-79b6b1d80737 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 811.540707] env[61439]: DEBUG nova.compute.manager [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 811.551240] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94fe4ef2-7d6e-4b8e-b20e-ffdb37c543bc {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 811.611969] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 811.696084] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b7bc045-fa69-4a20-ba88-875f1b955fbf {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 811.704043] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f401b47-303a-4620-8bef-20ca776caa39 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 811.737729] env[61439]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1e9c28f-3210-49fe-9aea-a1faeda56ece {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 811.746239] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e2bca5b-7b93-4fa8-8c0f-ca94645c0c08 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 811.762736] env[61439]: DEBUG nova.compute.provider_tree [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 811.772061] env[61439]: DEBUG nova.scheduler.client.report [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 811.791156] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.428s {{(pid=61439) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 811.791732] env[61439]: DEBUG nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 811.794020] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.700s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 811.795351] env[61439]: INFO nova.compute.claims [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 811.810703] env[61439]: DEBUG nova.policy [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6caf6ee2fa984516b47e691afad810b2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9bf4f2e40b3744e68d4740ffe6a94004', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 811.834878] env[61439]: DEBUG 
nova.compute.utils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 811.836154] env[61439]: DEBUG nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 811.836440] env[61439]: DEBUG nova.network.neutron [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 811.850857] env[61439]: DEBUG nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 811.935836] env[61439]: DEBUG nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 811.976156] env[61439]: DEBUG nova.virt.hardware [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 811.976618] env[61439]: DEBUG nova.virt.hardware [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 811.976899] env[61439]: DEBUG nova.virt.hardware [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 811.977195] env[61439]: DEBUG nova.virt.hardware [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Flavor pref 0:0:0 
{{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 811.977437] env[61439]: DEBUG nova.virt.hardware [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 811.977678] env[61439]: DEBUG nova.virt.hardware [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 811.978011] env[61439]: DEBUG nova.virt.hardware [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 811.979182] env[61439]: DEBUG nova.virt.hardware [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 811.979182] env[61439]: DEBUG nova.virt.hardware [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 811.979182] env[61439]: DEBUG nova.virt.hardware [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] 
Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 811.979182] env[61439]: DEBUG nova.virt.hardware [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 811.979994] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c8656b2-c498-439c-8d09-dc3cde522886 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 811.992513] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5053528-ebde-4a7c-9535-b3aa7069fdea {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.045220] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64e32e06-b296-4549-b570-999218fc2cf3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.054664] env[61439]: DEBUG nova.policy [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '919a60d5a73a4077b510084cc28a9499', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '278860d9a5104cebabf418408d8558d2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 
'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 812.058294] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-153d5561-8521-4484-b83d-6e29dba11249 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.096141] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-238d02e7-2959-44a8-bb37-2ba6b2a0961b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.104467] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29740cbe-ae87-4b87-982a-0f198f8d4173 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.117634] env[61439]: DEBUG nova.compute.provider_tree [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 812.132040] env[61439]: DEBUG nova.scheduler.client.report [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 812.149865] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.356s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 812.150322] env[61439]: DEBUG nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 812.152773] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.541s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 812.154186] env[61439]: INFO nova.compute.claims [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 812.207025] env[61439]: DEBUG nova.compute.utils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 812.210474] env[61439]: DEBUG nova.compute.manager [None 
req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 812.210924] env[61439]: DEBUG nova.network.neutron [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 812.224964] env[61439]: DEBUG nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 812.320261] env[61439]: DEBUG nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 812.355674] env[61439]: DEBUG nova.virt.hardware [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 812.355926] env[61439]: DEBUG nova.virt.hardware [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 812.356717] env[61439]: DEBUG nova.virt.hardware [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 812.356993] env[61439]: DEBUG nova.virt.hardware [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Flavor pref 0:0:0 
{{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 812.357177] env[61439]: DEBUG nova.virt.hardware [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 812.357333] env[61439]: DEBUG nova.virt.hardware [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 812.362905] env[61439]: DEBUG nova.virt.hardware [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 812.362905] env[61439]: DEBUG nova.virt.hardware [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 812.362905] env[61439]: DEBUG nova.virt.hardware [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 812.362905] env[61439]: DEBUG nova.virt.hardware [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] 
Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 812.362905] env[61439]: DEBUG nova.virt.hardware [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 812.363202] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0135fc7-3105-4767-bcc5-60178f988198 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.376418] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8342a08c-12c8-4743-9d79-af60b5df7d97 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.405221] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Acquiring lock "b779ccfb-9278-4a00-aadf-bd4afb0ab54a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 812.405467] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Lock "b779ccfb-9278-4a00-aadf-bd4afb0ab54a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 812.413794] env[61439]: 
DEBUG nova.network.neutron [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Successfully created port: b63cbd7f-55d4-4445-bc1f-668af5b37db5 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 812.457016] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e496edf0-c0b1-40b2-9e1e-9c200314d426 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.462387] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb4b5d73-7852-4190-97f7-8ea3810464e3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.494902] env[61439]: DEBUG nova.policy [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '919a60d5a73a4077b510084cc28a9499', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '278860d9a5104cebabf418408d8558d2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 812.496881] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db470b05-6c56-45ea-85dc-e60e47776ea1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.504838] env[61439]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-458dea09-f171-4223-ac95-2ec19b88b25d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.519370] env[61439]: DEBUG nova.compute.provider_tree [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 812.530410] env[61439]: DEBUG nova.scheduler.client.report [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 812.556810] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.404s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 812.557384] env[61439]: DEBUG nova.compute.manager [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 
0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 812.610587] env[61439]: DEBUG nova.compute.utils [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 812.611881] env[61439]: DEBUG nova.compute.manager [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 812.612068] env[61439]: DEBUG nova.network.neutron [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 812.623083] env[61439]: DEBUG nova.compute.manager [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 812.708446] env[61439]: DEBUG nova.compute.manager [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 812.750828] env[61439]: DEBUG nova.virt.hardware [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 812.750828] env[61439]: DEBUG nova.virt.hardware [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 812.750828] env[61439]: DEBUG nova.virt.hardware [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 812.750992] env[61439]: DEBUG nova.virt.hardware [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Flavor pref 0:0:0 
{{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 812.750992] env[61439]: DEBUG nova.virt.hardware [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 812.750992] env[61439]: DEBUG nova.virt.hardware [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 812.750992] env[61439]: DEBUG nova.virt.hardware [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 812.750992] env[61439]: DEBUG nova.virt.hardware [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 812.751197] env[61439]: DEBUG nova.virt.hardware [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 812.751197] env[61439]: DEBUG nova.virt.hardware [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 
tempest-ServerPasswordTestJSON-1865709812-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 812.751197] env[61439]: DEBUG nova.virt.hardware [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 812.751830] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0cf16615-6ea8-447f-bfc6-7ba9859509e8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.760123] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7b21035-98dd-40cd-91b3-d52c53c1d5ff {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 813.155497] env[61439]: DEBUG nova.policy [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'eac14fe0d2c5455cb602e436721b2054', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c3bb640c7d6845c3ae230cb286d1a8ba', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 813.747427] env[61439]: DEBUG nova.network.neutron [None req-8456ed09-09bb-4c88-a761-169e4179e901 
tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Successfully created port: 650ee4d0-5f8e-44be-b3d7-48701ce29ed7 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 814.021106] env[61439]: DEBUG nova.network.neutron [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Successfully created port: 72f3e9d6-cf65-454e-b7a9-2f0d35c86416 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 814.286643] env[61439]: DEBUG nova.network.neutron [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Successfully created port: 410abb97-ddc2-44bc-8edb-f7e053fb7273 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 815.315427] env[61439]: DEBUG oslo_concurrency.lockutils [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Acquiring lock "50266004-15d2-46cf-9f48-315f24831d24" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 815.315722] env[61439]: DEBUG oslo_concurrency.lockutils [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Lock "50266004-15d2-46cf-9f48-315f24831d24" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 815.516903] env[61439]: ERROR nova.compute.manager [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 81aad203-f976-4c1c-81d7-6848338446d7, please check neutron logs for more information. [ 815.516903] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 815.516903] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 815.516903] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 815.516903] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 815.516903] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 815.516903] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 815.516903] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 815.516903] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 815.516903] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 815.516903] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 815.516903] env[61439]: ERROR nova.compute.manager raise self.value [ 815.516903] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 815.516903] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 
815.516903] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 815.516903] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 815.517447] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 815.517447] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 815.517447] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 81aad203-f976-4c1c-81d7-6848338446d7, please check neutron logs for more information. [ 815.517447] env[61439]: ERROR nova.compute.manager [ 815.517447] env[61439]: Traceback (most recent call last): [ 815.517447] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 815.517447] env[61439]: listener.cb(fileno) [ 815.517447] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 815.517447] env[61439]: result = function(*args, **kwargs) [ 815.517447] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 815.517447] env[61439]: return func(*args, **kwargs) [ 815.517447] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 815.517447] env[61439]: raise e [ 815.517447] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 815.517447] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 815.517447] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 815.517447] env[61439]: created_port_ids = self._update_ports_for_instance( [ 815.517447] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 815.517447] env[61439]: with excutils.save_and_reraise_exception(): [ 
815.517447] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 815.517447] env[61439]: self.force_reraise() [ 815.517447] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 815.517447] env[61439]: raise self.value [ 815.517447] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 815.517447] env[61439]: updated_port = self._update_port( [ 815.517447] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 815.517447] env[61439]: _ensure_no_port_binding_failure(port) [ 815.517447] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 815.517447] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 815.518249] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 81aad203-f976-4c1c-81d7-6848338446d7, please check neutron logs for more information. [ 815.518249] env[61439]: Removing descriptor: 23 [ 815.518249] env[61439]: ERROR nova.compute.manager [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 81aad203-f976-4c1c-81d7-6848338446d7, please check neutron logs for more information. 
[ 815.518249] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Traceback (most recent call last): [ 815.518249] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 815.518249] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] yield resources [ 815.518249] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 815.518249] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] self.driver.spawn(context, instance, image_meta, [ 815.518249] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 815.518249] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 815.518249] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 815.518249] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] vm_ref = self.build_virtual_machine(instance, [ 815.518622] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 815.518622] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] vif_infos = vmwarevif.get_vif_info(self._session, [ 815.518622] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 815.518622] env[61439]: ERROR 
nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] for vif in network_info: [ 815.518622] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 815.518622] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] return self._sync_wrapper(fn, *args, **kwargs) [ 815.518622] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 815.518622] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] self.wait() [ 815.518622] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 815.518622] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] self[:] = self._gt.wait() [ 815.518622] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 815.518622] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] return self._exit_event.wait() [ 815.518622] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 815.519049] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] result = hub.switch() [ 815.519049] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 815.519049] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] return self.greenlet.switch() [ 815.519049] 
env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 815.519049] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] result = function(*args, **kwargs) [ 815.519049] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 815.519049] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] return func(*args, **kwargs) [ 815.519049] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 815.519049] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] raise e [ 815.519049] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 815.519049] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] nwinfo = self.network_api.allocate_for_instance( [ 815.519049] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 815.519049] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] created_port_ids = self._update_ports_for_instance( [ 815.519410] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 815.519410] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] with excutils.save_and_reraise_exception(): [ 815.519410] env[61439]: ERROR nova.compute.manager 
[instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 815.519410] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] self.force_reraise() [ 815.519410] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 815.519410] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] raise self.value [ 815.519410] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 815.519410] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] updated_port = self._update_port( [ 815.519410] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 815.519410] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] _ensure_no_port_binding_failure(port) [ 815.519410] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 815.519410] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] raise exception.PortBindingFailed(port_id=port['id']) [ 815.519763] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] nova.exception.PortBindingFailed: Binding failed for port 81aad203-f976-4c1c-81d7-6848338446d7, please check neutron logs for more information. 
[ 815.519763] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] [ 815.519763] env[61439]: INFO nova.compute.manager [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Terminating instance [ 815.521429] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Acquiring lock "refresh_cache-c30b7561-2c76-4d68-93da-f73e7c6a0ed7" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 815.521429] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Acquired lock "refresh_cache-c30b7561-2c76-4d68-93da-f73e7c6a0ed7" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 815.521429] env[61439]: DEBUG nova.network.neutron [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 815.572524] env[61439]: DEBUG nova.network.neutron [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Successfully created port: def2fbbb-a0c8-4947-8402-02a1fc787cff {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 815.759855] env[61439]: DEBUG 
nova.network.neutron [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 816.134644] env[61439]: DEBUG nova.network.neutron [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 816.145146] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Releasing lock "refresh_cache-c30b7561-2c76-4d68-93da-f73e7c6a0ed7" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 816.145580] env[61439]: DEBUG nova.compute.manager [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 816.145775] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 816.146511] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-fa396d3f-7a70-4b65-9bfc-bdde697f5d36 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 816.157286] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e957715-2808-4ee8-916f-2903c52003d6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 816.183012] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c30b7561-2c76-4d68-93da-f73e7c6a0ed7 could not be found. 
[ 816.183325] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 816.183544] env[61439]: INFO nova.compute.manager [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Took 0.04 seconds to destroy the instance on the hypervisor. [ 816.183954] env[61439]: DEBUG oslo.service.loopingcall [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 816.184258] env[61439]: DEBUG nova.compute.manager [-] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 816.184420] env[61439]: DEBUG nova.network.neutron [-] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 816.239471] env[61439]: DEBUG nova.network.neutron [-] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 816.248316] env[61439]: DEBUG nova.network.neutron [-] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 816.258417] env[61439]: INFO nova.compute.manager [-] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Took 0.07 seconds to deallocate network for instance. [ 816.261338] env[61439]: DEBUG nova.compute.claims [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 816.261338] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 816.261338] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 816.484694] env[61439]: ERROR nova.compute.manager [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Instance failed network setup after 1 attempt(s): 
nova.exception.PortBindingFailed: Binding failed for port 5a95018b-84db-45e5-8b22-d05ea50159a5, please check neutron logs for more information. [ 816.484694] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 816.484694] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 816.484694] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 816.484694] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 816.484694] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 816.484694] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 816.484694] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 816.484694] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 816.484694] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 816.484694] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 816.484694] env[61439]: ERROR nova.compute.manager raise self.value [ 816.484694] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 816.484694] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 816.484694] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 816.484694] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 816.485354] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in 
_ensure_no_port_binding_failure [ 816.485354] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 816.485354] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 5a95018b-84db-45e5-8b22-d05ea50159a5, please check neutron logs for more information. [ 816.485354] env[61439]: ERROR nova.compute.manager [ 816.485354] env[61439]: Traceback (most recent call last): [ 816.485354] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 816.485354] env[61439]: listener.cb(fileno) [ 816.485354] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 816.485354] env[61439]: result = function(*args, **kwargs) [ 816.485354] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 816.485354] env[61439]: return func(*args, **kwargs) [ 816.485354] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 816.485354] env[61439]: raise e [ 816.485354] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 816.485354] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 816.485354] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 816.485354] env[61439]: created_port_ids = self._update_ports_for_instance( [ 816.485354] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 816.485354] env[61439]: with excutils.save_and_reraise_exception(): [ 816.485354] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 816.485354] env[61439]: self.force_reraise() [ 816.485354] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 816.485354] env[61439]: 
raise self.value [ 816.485354] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 816.485354] env[61439]: updated_port = self._update_port( [ 816.485354] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 816.485354] env[61439]: _ensure_no_port_binding_failure(port) [ 816.485354] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 816.485354] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 816.486567] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 5a95018b-84db-45e5-8b22-d05ea50159a5, please check neutron logs for more information. [ 816.486567] env[61439]: Removing descriptor: 20 [ 816.486567] env[61439]: ERROR nova.compute.manager [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 5a95018b-84db-45e5-8b22-d05ea50159a5, please check neutron logs for more information. 
[ 816.486567] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Traceback (most recent call last): [ 816.486567] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 816.486567] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] yield resources [ 816.486567] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 816.486567] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] self.driver.spawn(context, instance, image_meta, [ 816.486567] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 816.486567] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 816.486567] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 816.486567] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] vm_ref = self.build_virtual_machine(instance, [ 816.486905] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 816.486905] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] vif_infos = vmwarevif.get_vif_info(self._session, [ 816.486905] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 816.486905] env[61439]: ERROR 
nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] for vif in network_info: [ 816.486905] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 816.486905] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] return self._sync_wrapper(fn, *args, **kwargs) [ 816.486905] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 816.486905] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] self.wait() [ 816.486905] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 816.486905] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] self[:] = self._gt.wait() [ 816.486905] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 816.486905] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] return self._exit_event.wait() [ 816.486905] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 816.487259] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] result = hub.switch() [ 816.487259] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 816.487259] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] return self.greenlet.switch() [ 816.487259] 
env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 816.487259] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] result = function(*args, **kwargs) [ 816.487259] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 816.487259] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] return func(*args, **kwargs) [ 816.487259] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 816.487259] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] raise e [ 816.487259] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 816.487259] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] nwinfo = self.network_api.allocate_for_instance( [ 816.487259] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 816.487259] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] created_port_ids = self._update_ports_for_instance( [ 816.487642] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 816.487642] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] with excutils.save_and_reraise_exception(): [ 816.487642] env[61439]: ERROR nova.compute.manager 
[instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 816.487642] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] self.force_reraise() [ 816.487642] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 816.487642] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] raise self.value [ 816.487642] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 816.487642] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] updated_port = self._update_port( [ 816.487642] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 816.487642] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] _ensure_no_port_binding_failure(port) [ 816.487642] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 816.487642] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] raise exception.PortBindingFailed(port_id=port['id']) [ 816.488212] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] nova.exception.PortBindingFailed: Binding failed for port 5a95018b-84db-45e5-8b22-d05ea50159a5, please check neutron logs for more information. 
[ 816.488212] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] [ 816.488212] env[61439]: INFO nova.compute.manager [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Terminating instance [ 816.489832] env[61439]: DEBUG oslo_concurrency.lockutils [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Acquiring lock "refresh_cache-025992d2-423e-492d-b820-7d9f3554c0f6" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 816.489832] env[61439]: DEBUG oslo_concurrency.lockutils [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Acquired lock "refresh_cache-025992d2-423e-492d-b820-7d9f3554c0f6" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 816.489969] env[61439]: DEBUG nova.network.neutron [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 816.495188] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-818b002e-53a2-44ac-ad19-8911e8ef544a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 816.509386] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffb8988f-7890-429d-b982-08923aafe02f 
{{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 816.548150] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ad04d05-b31f-4056-ba82-31e2915e20e5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 816.556896] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07b8014d-9b3a-4c99-8d5a-5b257bc57a7a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 816.562553] env[61439]: DEBUG nova.network.neutron [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 816.576316] env[61439]: DEBUG nova.compute.provider_tree [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 816.590426] env[61439]: DEBUG nova.scheduler.client.report [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 
'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 816.614901] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.352s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 816.614901] env[61439]: ERROR nova.compute.manager [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 81aad203-f976-4c1c-81d7-6848338446d7, please check neutron logs for more information. 
[ 816.614901] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Traceback (most recent call last): [ 816.614901] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 816.614901] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] self.driver.spawn(context, instance, image_meta, [ 816.614901] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 816.614901] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 816.614901] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 816.614901] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] vm_ref = self.build_virtual_machine(instance, [ 816.615290] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 816.615290] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] vif_infos = vmwarevif.get_vif_info(self._session, [ 816.615290] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 816.615290] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] for vif in network_info: [ 816.615290] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 816.615290] env[61439]: ERROR 
nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] return self._sync_wrapper(fn, *args, **kwargs) [ 816.615290] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 816.615290] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] self.wait() [ 816.615290] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 816.615290] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] self[:] = self._gt.wait() [ 816.615290] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 816.615290] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] return self._exit_event.wait() [ 816.615290] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 816.615631] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] result = hub.switch() [ 816.615631] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 816.615631] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] return self.greenlet.switch() [ 816.615631] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 816.615631] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] result 
= function(*args, **kwargs) [ 816.615631] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 816.615631] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] return func(*args, **kwargs) [ 816.615631] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 816.615631] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] raise e [ 816.615631] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 816.615631] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] nwinfo = self.network_api.allocate_for_instance( [ 816.615631] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 816.615631] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] created_port_ids = self._update_ports_for_instance( [ 816.615958] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 816.615958] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] with excutils.save_and_reraise_exception(): [ 816.615958] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 816.615958] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] self.force_reraise() [ 816.615958] env[61439]: 
ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 816.615958] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] raise self.value [ 816.615958] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 816.615958] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] updated_port = self._update_port( [ 816.615958] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 816.615958] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] _ensure_no_port_binding_failure(port) [ 816.615958] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 816.615958] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] raise exception.PortBindingFailed(port_id=port['id']) [ 816.616553] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] nova.exception.PortBindingFailed: Binding failed for port 81aad203-f976-4c1c-81d7-6848338446d7, please check neutron logs for more information. 
[ 816.616553] env[61439]: ERROR nova.compute.manager [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] [ 816.616553] env[61439]: DEBUG nova.compute.utils [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Binding failed for port 81aad203-f976-4c1c-81d7-6848338446d7, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 816.618171] env[61439]: DEBUG nova.compute.manager [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Build of instance c30b7561-2c76-4d68-93da-f73e7c6a0ed7 was re-scheduled: Binding failed for port 81aad203-f976-4c1c-81d7-6848338446d7, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 816.618171] env[61439]: DEBUG nova.compute.manager [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 816.618531] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Acquiring lock "refresh_cache-c30b7561-2c76-4d68-93da-f73e7c6a0ed7" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 816.618751] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Acquired lock "refresh_cache-c30b7561-2c76-4d68-93da-f73e7c6a0ed7" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 816.618988] env[61439]: DEBUG nova.network.neutron [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 816.716057] env[61439]: DEBUG nova.network.neutron [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 817.227081] env[61439]: DEBUG nova.network.neutron [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 817.239375] env[61439]: DEBUG oslo_concurrency.lockutils [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Releasing lock "refresh_cache-025992d2-423e-492d-b820-7d9f3554c0f6" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 817.239782] env[61439]: DEBUG nova.compute.manager [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 817.239962] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 817.240593] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b6bcaf63-9384-4472-b0c3-53feda173332 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.250860] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c09a4c7-1df4-4d0a-8648-9d3c3132d525 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.277370] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 025992d2-423e-492d-b820-7d9f3554c0f6 could not be found. 
[ 817.277613] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 817.278114] env[61439]: INFO nova.compute.manager [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Took 0.04 seconds to destroy the instance on the hypervisor. [ 817.278433] env[61439]: DEBUG oslo.service.loopingcall [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 817.278673] env[61439]: DEBUG nova.compute.manager [-] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 817.278768] env[61439]: DEBUG nova.network.neutron [-] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 817.323123] env[61439]: DEBUG nova.network.neutron [-] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 817.333197] env[61439]: DEBUG nova.network.neutron [-] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 817.341449] env[61439]: INFO nova.compute.manager [-] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Took 0.06 seconds to deallocate network for instance. [ 817.343579] env[61439]: DEBUG nova.compute.claims [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 817.343757] env[61439]: DEBUG oslo_concurrency.lockutils [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 817.343971] env[61439]: DEBUG oslo_concurrency.lockutils [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 817.374380] env[61439]: DEBUG nova.network.neutron [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Updating 
instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 817.387784] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Releasing lock "refresh_cache-c30b7561-2c76-4d68-93da-f73e7c6a0ed7" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 817.387784] env[61439]: DEBUG nova.compute.manager [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 817.387784] env[61439]: DEBUG nova.compute.manager [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 817.387784] env[61439]: DEBUG nova.network.neutron [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 817.426789] env[61439]: DEBUG nova.network.neutron [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Instance cache missing network 
info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 817.435894] env[61439]: DEBUG nova.network.neutron [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 817.446890] env[61439]: INFO nova.compute.manager [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] [instance: c30b7561-2c76-4d68-93da-f73e7c6a0ed7] Took 0.06 seconds to deallocate network for instance. [ 817.558064] env[61439]: INFO nova.scheduler.client.report [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Deleted allocations for instance c30b7561-2c76-4d68-93da-f73e7c6a0ed7 [ 817.580216] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4887a2ec-4ebb-4d88-930a-8c9f780af24e tempest-FloatingIPsAssociationTestJSON-1026406162 tempest-FloatingIPsAssociationTestJSON-1026406162-project-member] Lock "c30b7561-2c76-4d68-93da-f73e7c6a0ed7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.266s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 817.593443] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-328ee643-1854-4b39-bfe3-a1fe0a1ad50b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.604819] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-e6e7be00-797e-431d-88d6-6ceccabf4f11 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.608548] env[61439]: DEBUG nova.compute.manager [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 817.648930] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cee5472-5b7a-4abc-a5a6-4ac1a5e39a01 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.663552] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3e77ed0-ba4c-4cd0-b52f-fa928f8a7eca {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.682757] env[61439]: DEBUG nova.compute.provider_tree [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 817.684946] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 817.693872] env[61439]: DEBUG nova.scheduler.client.report [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 
tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 817.714057] env[61439]: DEBUG oslo_concurrency.lockutils [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.370s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 817.715075] env[61439]: ERROR nova.compute.manager [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 5a95018b-84db-45e5-8b22-d05ea50159a5, please check neutron logs for more information. 
[ 817.715075] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Traceback (most recent call last): [ 817.715075] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 817.715075] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] self.driver.spawn(context, instance, image_meta, [ 817.715075] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 817.715075] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 817.715075] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 817.715075] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] vm_ref = self.build_virtual_machine(instance, [ 817.715075] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 817.715075] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] vif_infos = vmwarevif.get_vif_info(self._session, [ 817.715075] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 817.715569] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] for vif in network_info: [ 817.715569] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 817.715569] env[61439]: ERROR 
nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] return self._sync_wrapper(fn, *args, **kwargs) [ 817.715569] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 817.715569] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] self.wait() [ 817.715569] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 817.715569] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] self[:] = self._gt.wait() [ 817.715569] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 817.715569] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] return self._exit_event.wait() [ 817.715569] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 817.715569] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] result = hub.switch() [ 817.715569] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 817.715569] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] return self.greenlet.switch() [ 817.716458] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 817.716458] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] result 
= function(*args, **kwargs) [ 817.716458] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 817.716458] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] return func(*args, **kwargs) [ 817.716458] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 817.716458] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] raise e [ 817.716458] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 817.716458] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] nwinfo = self.network_api.allocate_for_instance( [ 817.716458] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 817.716458] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] created_port_ids = self._update_ports_for_instance( [ 817.716458] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 817.716458] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] with excutils.save_and_reraise_exception(): [ 817.716458] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 817.716866] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] self.force_reraise() [ 817.716866] env[61439]: 
ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 817.716866] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] raise self.value [ 817.716866] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 817.716866] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] updated_port = self._update_port( [ 817.716866] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 817.716866] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] _ensure_no_port_binding_failure(port) [ 817.716866] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 817.716866] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] raise exception.PortBindingFailed(port_id=port['id']) [ 817.716866] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] nova.exception.PortBindingFailed: Binding failed for port 5a95018b-84db-45e5-8b22-d05ea50159a5, please check neutron logs for more information. 
[ 817.716866] env[61439]: ERROR nova.compute.manager [instance: 025992d2-423e-492d-b820-7d9f3554c0f6]
[ 817.717210] env[61439]: DEBUG nova.compute.utils [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Binding failed for port 5a95018b-84db-45e5-8b22-d05ea50159a5, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 817.717210] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.032s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 817.718769] env[61439]: INFO nova.compute.claims [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 817.721802] env[61439]: DEBUG nova.compute.manager [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Build of instance 025992d2-423e-492d-b820-7d9f3554c0f6 was re-scheduled: Binding failed for port 5a95018b-84db-45e5-8b22-d05ea50159a5, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 817.722298] env[61439]: DEBUG nova.compute.manager [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 817.722529] env[61439]: DEBUG oslo_concurrency.lockutils [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Acquiring lock "refresh_cache-025992d2-423e-492d-b820-7d9f3554c0f6" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 817.722680] env[61439]: DEBUG oslo_concurrency.lockutils [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Acquired lock "refresh_cache-025992d2-423e-492d-b820-7d9f3554c0f6" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 817.723111] env[61439]: DEBUG nova.network.neutron [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 817.763912] env[61439]: DEBUG nova.network.neutron [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 817.925469] env[61439]: ERROR nova.compute.manager [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port b63cbd7f-55d4-4445-bc1f-668af5b37db5, please check neutron logs for more information.
[ 817.925469] env[61439]: ERROR nova.compute.manager Traceback (most recent call last):
[ 817.925469] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 817.925469] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 817.925469] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 817.925469] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 817.925469] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 817.925469] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 817.925469] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 817.925469] env[61439]: ERROR nova.compute.manager self.force_reraise()
[ 817.925469] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 817.925469] env[61439]: ERROR nova.compute.manager raise self.value
[ 817.925469] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 817.925469] env[61439]: ERROR nova.compute.manager updated_port = self._update_port(
[ 817.925469] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 817.925469] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 817.926742] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 817.926742] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 817.926742] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port b63cbd7f-55d4-4445-bc1f-668af5b37db5, please check neutron logs for more information.
[ 817.926742] env[61439]: ERROR nova.compute.manager
[ 817.926742] env[61439]: Traceback (most recent call last):
[ 817.926742] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait
[ 817.926742] env[61439]: listener.cb(fileno)
[ 817.926742] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 817.926742] env[61439]: result = function(*args, **kwargs)
[ 817.926742] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 817.926742] env[61439]: return func(*args, **kwargs)
[ 817.926742] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 817.926742] env[61439]: raise e
[ 817.926742] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 817.926742] env[61439]: nwinfo = self.network_api.allocate_for_instance(
[ 817.926742] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 817.926742] env[61439]: created_port_ids = self._update_ports_for_instance(
[ 817.926742] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 817.926742] env[61439]: with excutils.save_and_reraise_exception():
[ 817.926742] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 817.926742] env[61439]: self.force_reraise()
[ 817.926742] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 817.926742] env[61439]: raise self.value
[ 817.926742] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 817.926742] env[61439]: updated_port = self._update_port(
[ 817.926742] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 817.926742] env[61439]: _ensure_no_port_binding_failure(port)
[ 817.926742] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 817.926742] env[61439]: raise exception.PortBindingFailed(port_id=port['id'])
[ 817.928380] env[61439]: nova.exception.PortBindingFailed: Binding failed for port b63cbd7f-55d4-4445-bc1f-668af5b37db5, please check neutron logs for more information.
[ 817.928380] env[61439]: Removing descriptor: 21
[ 817.928380] env[61439]: ERROR nova.compute.manager [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port b63cbd7f-55d4-4445-bc1f-668af5b37db5, please check neutron logs for more information.
[ 817.928380] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Traceback (most recent call last):
[ 817.928380] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 817.928380] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] yield resources
[ 817.928380] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 817.928380] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] self.driver.spawn(context, instance, image_meta,
[ 817.928380] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 817.928380] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 817.928380] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 817.928380] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] vm_ref = self.build_virtual_machine(instance,
[ 817.928714] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 817.928714] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] vif_infos = vmwarevif.get_vif_info(self._session,
[ 817.928714] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 817.928714] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] for vif in network_info:
[ 817.928714] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 817.928714] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] return self._sync_wrapper(fn, *args, **kwargs)
[ 817.928714] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 817.928714] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] self.wait()
[ 817.928714] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 817.928714] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] self[:] = self._gt.wait()
[ 817.928714] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 817.928714] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] return self._exit_event.wait()
[ 817.928714] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 817.929070] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] result = hub.switch()
[ 817.929070] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 817.929070] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] return self.greenlet.switch()
[ 817.929070] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 817.929070] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] result = function(*args, **kwargs)
[ 817.929070] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 817.929070] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] return func(*args, **kwargs)
[ 817.929070] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 817.929070] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] raise e
[ 817.929070] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 817.929070] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] nwinfo = self.network_api.allocate_for_instance(
[ 817.929070] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 817.929070] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] created_port_ids = self._update_ports_for_instance(
[ 817.929442] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 817.929442] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] with excutils.save_and_reraise_exception():
[ 817.929442] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 817.929442] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] self.force_reraise()
[ 817.929442] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 817.929442] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] raise self.value
[ 817.929442] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 817.929442] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] updated_port = self._update_port(
[ 817.929442] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 817.929442] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] _ensure_no_port_binding_failure(port)
[ 817.929442] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 817.929442] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] raise exception.PortBindingFailed(port_id=port['id'])
[ 817.929763] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] nova.exception.PortBindingFailed: Binding failed for port b63cbd7f-55d4-4445-bc1f-668af5b37db5, please check neutron logs for more information.
[ 817.929763] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65]
[ 817.929763] env[61439]: INFO nova.compute.manager [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Terminating instance
[ 817.929763] env[61439]: DEBUG oslo_concurrency.lockutils [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Acquiring lock "refresh_cache-de136287-7dfb-4829-aef7-96ae82acfa65" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 817.929763] env[61439]: DEBUG oslo_concurrency.lockutils [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Acquired lock "refresh_cache-de136287-7dfb-4829-aef7-96ae82acfa65" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 817.929763] env[61439]: DEBUG nova.network.neutron [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 817.986934] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc2c03fd-9ed3-4b24-9551-48b9fe365135 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 817.994564] env[61439]: DEBUG nova.network.neutron [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 818.001188] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3237708d-61dd-4d39-a86c-d024a913185b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 818.038859] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21102411-38ed-446c-a70c-b34f4bcc0fdc {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 818.048036] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b00ec948-2172-4079-b30c-c5483ca1b43a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 818.063835] env[61439]: DEBUG nova.compute.provider_tree [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 818.077937] env[61439]: DEBUG nova.scheduler.client.report [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 818.093714] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.376s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 818.094410] env[61439]: DEBUG nova.compute.manager [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 818.136030] env[61439]: DEBUG nova.compute.utils [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 818.136573] env[61439]: DEBUG nova.compute.manager [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 818.140023] env[61439]: DEBUG nova.network.neutron [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 818.146467] env[61439]: DEBUG nova.compute.manager [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 818.231418] env[61439]: DEBUG nova.compute.manager [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 818.258063] env[61439]: DEBUG nova.virt.hardware [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 818.258311] env[61439]: DEBUG nova.virt.hardware [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 818.258516] env[61439]: DEBUG nova.virt.hardware [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 818.258738] env[61439]: DEBUG nova.virt.hardware [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 818.258891] env[61439]: DEBUG nova.virt.hardware [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 818.259197] env[61439]: DEBUG nova.virt.hardware [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 818.259453] env[61439]: DEBUG nova.virt.hardware [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 818.259621] env[61439]: DEBUG nova.virt.hardware [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 818.259790] env[61439]: DEBUG nova.virt.hardware [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 818.259952] env[61439]: DEBUG nova.virt.hardware [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 818.260140] env[61439]: DEBUG nova.virt.hardware [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 818.260986] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-056d9b14-8f86-4aa0-aa76-e9f556638f48 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 818.270237] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e61a3af6-4c2e-4df4-8c61-08a78fc57f67 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 818.427265] env[61439]: DEBUG nova.network.neutron [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 818.468880] env[61439]: DEBUG nova.policy [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c468b82bd9d64e19b419a393fff4af06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f86a86563cc047459d3e7c0553c82c63', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}}
[ 818.479329] env[61439]: DEBUG oslo_concurrency.lockutils [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Releasing lock "refresh_cache-025992d2-423e-492d-b820-7d9f3554c0f6" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 818.479329] env[61439]: DEBUG nova.compute.manager [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 818.479329] env[61439]: DEBUG nova.compute.manager [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 818.479329] env[61439]: DEBUG nova.network.neutron [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 818.552348] env[61439]: DEBUG nova.network.neutron [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 818.568996] env[61439]: DEBUG nova.network.neutron [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 818.585541] env[61439]: INFO nova.compute.manager [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] [instance: 025992d2-423e-492d-b820-7d9f3554c0f6] Took 0.11 seconds to deallocate network for instance.
[ 818.678782] env[61439]: DEBUG nova.network.neutron [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 818.695849] env[61439]: DEBUG oslo_concurrency.lockutils [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Releasing lock "refresh_cache-de136287-7dfb-4829-aef7-96ae82acfa65" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 818.696452] env[61439]: DEBUG nova.compute.manager [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 818.696726] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 818.697758] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-58184af6-3aba-4ac8-ba67-90951bce9257 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 818.711142] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d26e417-96ac-4717-a451-03feaf0ae60f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 818.726057] env[61439]: INFO nova.scheduler.client.report [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Deleted allocations for instance 025992d2-423e-492d-b820-7d9f3554c0f6
[ 818.751153] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance de136287-7dfb-4829-aef7-96ae82acfa65 could not be found.
[ 818.751290] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 818.751522] env[61439]: INFO nova.compute.manager [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Took 0.05 seconds to destroy the instance on the hypervisor. [ 818.751714] env[61439]: DEBUG oslo.service.loopingcall [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 818.754278] env[61439]: DEBUG oslo_concurrency.lockutils [None req-1c57db5c-9254-46dc-9e36-e447e34ffafc tempest-AttachInterfacesUnderV243Test-2070080807 tempest-AttachInterfacesUnderV243Test-2070080807-project-member] Lock "025992d2-423e-492d-b820-7d9f3554c0f6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.582s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 818.754278] env[61439]: DEBUG nova.compute.manager [-] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 818.754278] env[61439]: DEBUG nova.network.neutron [-] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 818.773476] env[61439]: DEBUG nova.compute.manager [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 818.820163] env[61439]: DEBUG nova.network.neutron [-] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 818.833817] env[61439]: DEBUG oslo_concurrency.lockutils [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 818.834030] env[61439]: DEBUG oslo_concurrency.lockutils [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 818.835607] env[61439]: INFO nova.compute.claims [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 818.839300] env[61439]: DEBUG nova.network.neutron [-] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 818.850852] env[61439]: INFO nova.compute.manager [-] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Took 0.10 seconds to deallocate network for instance. 
[ 818.854067] env[61439]: DEBUG nova.compute.claims [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 818.854378] env[61439]: DEBUG oslo_concurrency.lockutils [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 819.052033] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dbfb2c71-4a61-4fb8-af16-e0473aa25a8f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 819.061775] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4b96363-651a-45b8-8ba2-d3f2df09479b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 819.094205] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edc42a91-7fa9-437b-bdfe-5acdc2a792c3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 819.103940] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db3f628d-7228-4108-85fe-4d2cc5895f39 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 819.124384] env[61439]: DEBUG nova.compute.provider_tree [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 
tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 819.144669] env[61439]: DEBUG nova.scheduler.client.report [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 819.206475] env[61439]: DEBUG oslo_concurrency.lockutils [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.372s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 819.207048] env[61439]: DEBUG nova.compute.manager [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Start building networks asynchronously for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 819.212127] env[61439]: DEBUG oslo_concurrency.lockutils [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.357s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 819.284377] env[61439]: DEBUG nova.compute.utils [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 819.286496] env[61439]: DEBUG nova.compute.manager [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 819.286678] env[61439]: DEBUG nova.network.neutron [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 819.299176] env[61439]: DEBUG nova.compute.manager [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Start building block device mappings for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 819.304414] env[61439]: ERROR nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 72f3e9d6-cf65-454e-b7a9-2f0d35c86416, please check neutron logs for more information. [ 819.304414] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 819.304414] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 819.304414] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 819.304414] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 819.304414] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 819.304414] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 819.304414] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 819.304414] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 819.304414] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 819.304414] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 819.304414] env[61439]: ERROR nova.compute.manager raise self.value [ 819.304414] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 819.304414] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 819.304414] env[61439]: 
ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 819.304414] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 819.305100] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 819.305100] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 819.305100] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 72f3e9d6-cf65-454e-b7a9-2f0d35c86416, please check neutron logs for more information. [ 819.305100] env[61439]: ERROR nova.compute.manager [ 819.305100] env[61439]: Traceback (most recent call last): [ 819.305100] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 819.305100] env[61439]: listener.cb(fileno) [ 819.305100] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 819.305100] env[61439]: result = function(*args, **kwargs) [ 819.305100] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 819.305100] env[61439]: return func(*args, **kwargs) [ 819.305100] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 819.305100] env[61439]: raise e [ 819.305100] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 819.305100] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 819.305100] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 819.305100] env[61439]: created_port_ids = self._update_ports_for_instance( [ 819.305100] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 819.305100] env[61439]: with excutils.save_and_reraise_exception(): [ 819.305100] env[61439]: File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 819.305100] env[61439]: self.force_reraise() [ 819.305100] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 819.305100] env[61439]: raise self.value [ 819.305100] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 819.305100] env[61439]: updated_port = self._update_port( [ 819.305100] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 819.305100] env[61439]: _ensure_no_port_binding_failure(port) [ 819.305100] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 819.305100] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 819.305890] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 72f3e9d6-cf65-454e-b7a9-2f0d35c86416, please check neutron logs for more information. [ 819.305890] env[61439]: Removing descriptor: 24 [ 819.305890] env[61439]: ERROR nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 72f3e9d6-cf65-454e-b7a9-2f0d35c86416, please check neutron logs for more information. 
[ 819.305890] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Traceback (most recent call last): [ 819.305890] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 819.305890] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] yield resources [ 819.305890] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 819.305890] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] self.driver.spawn(context, instance, image_meta, [ 819.305890] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 819.305890] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 819.305890] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 819.305890] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] vm_ref = self.build_virtual_machine(instance, [ 819.306259] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 819.306259] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] vif_infos = vmwarevif.get_vif_info(self._session, [ 819.306259] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 819.306259] env[61439]: ERROR 
nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] for vif in network_info: [ 819.306259] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 819.306259] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] return self._sync_wrapper(fn, *args, **kwargs) [ 819.306259] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 819.306259] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] self.wait() [ 819.306259] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 819.306259] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] self[:] = self._gt.wait() [ 819.306259] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 819.306259] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] return self._exit_event.wait() [ 819.306259] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 819.308173] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] result = hub.switch() [ 819.308173] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 819.308173] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] return self.greenlet.switch() [ 819.308173] 
env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 819.308173] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] result = function(*args, **kwargs) [ 819.308173] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 819.308173] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] return func(*args, **kwargs) [ 819.308173] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 819.308173] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] raise e [ 819.308173] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 819.308173] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] nwinfo = self.network_api.allocate_for_instance( [ 819.308173] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 819.308173] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] created_port_ids = self._update_ports_for_instance( [ 819.310764] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 819.310764] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] with excutils.save_and_reraise_exception(): [ 819.310764] env[61439]: ERROR nova.compute.manager 
[instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 819.310764] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] self.force_reraise() [ 819.310764] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 819.310764] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] raise self.value [ 819.310764] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 819.310764] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] updated_port = self._update_port( [ 819.310764] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 819.310764] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] _ensure_no_port_binding_failure(port) [ 819.310764] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 819.310764] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] raise exception.PortBindingFailed(port_id=port['id']) [ 819.311342] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] nova.exception.PortBindingFailed: Binding failed for port 72f3e9d6-cf65-454e-b7a9-2f0d35c86416, please check neutron logs for more information. 
[ 819.311342] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] [ 819.311342] env[61439]: INFO nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Terminating instance [ 819.311342] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Acquiring lock "refresh_cache-cfda19ad-e831-4018-9228-e96fede0bae6" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 819.311342] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Acquired lock "refresh_cache-cfda19ad-e831-4018-9228-e96fede0bae6" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 819.311342] env[61439]: DEBUG nova.network.neutron [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 819.382421] env[61439]: DEBUG nova.compute.manager [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 819.419644] env[61439]: DEBUG nova.virt.hardware [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 819.419644] env[61439]: DEBUG nova.virt.hardware [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 819.419644] env[61439]: DEBUG nova.virt.hardware [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 819.420106] env[61439]: DEBUG nova.virt.hardware [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Flavor 
pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 819.420106] env[61439]: DEBUG nova.virt.hardware [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 819.420106] env[61439]: DEBUG nova.virt.hardware [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 819.420106] env[61439]: DEBUG nova.virt.hardware [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 819.420106] env[61439]: DEBUG nova.virt.hardware [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 819.420352] env[61439]: DEBUG nova.virt.hardware [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 819.420352] env[61439]: DEBUG nova.virt.hardware [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 
tempest-AttachVolumeNegativeTest-1225387431-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 819.420495] env[61439]: DEBUG nova.virt.hardware [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 819.422048] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56b10522-62a6-4b74-9c2d-7a0081d25a4e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 819.433011] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c67714eb-3ec5-4705-8d6d-7a5381b64272 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 819.487070] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f64ebc4-9bad-4cf9-b788-0c99c604fd2a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 819.495469] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34e5bdb1-402d-41f3-896c-1b545acb5d3c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 819.529842] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11e13110-fa8c-4a06-b8b3-0649695a8935 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 819.536715] env[61439]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf68080c-90a4-4306-b93a-b5128f8dbbfe {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 819.552838] env[61439]: DEBUG nova.compute.provider_tree [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 819.554731] env[61439]: DEBUG nova.network.neutron [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 819.563562] env[61439]: DEBUG nova.scheduler.client.report [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 819.584277] env[61439]: DEBUG oslo_concurrency.lockutils [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Lock "compute_resources" "released" 
by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.373s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 819.585511] env[61439]: ERROR nova.compute.manager [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port b63cbd7f-55d4-4445-bc1f-668af5b37db5, please check neutron logs for more information.
[ 819.585511] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Traceback (most recent call last):
[ 819.585511] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 819.585511] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] self.driver.spawn(context, instance, image_meta,
[ 819.585511] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 819.585511] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 819.585511] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 819.585511] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] vm_ref = self.build_virtual_machine(instance,
[ 819.585511] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 819.585511] env[61439]: ERROR
nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] vif_infos = vmwarevif.get_vif_info(self._session,
[ 819.585511] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 819.586034] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] for vif in network_info:
[ 819.586034] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 819.586034] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] return self._sync_wrapper(fn, *args, **kwargs)
[ 819.586034] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 819.586034] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] self.wait()
[ 819.586034] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 819.586034] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] self[:] = self._gt.wait()
[ 819.586034] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 819.586034] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] return self._exit_event.wait()
[ 819.586034] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 819.586034] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] result = hub.switch()
[ 819.586034]
env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 819.586034] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] return self.greenlet.switch()
[ 819.586435] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 819.586435] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] result = function(*args, **kwargs)
[ 819.586435] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 819.586435] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] return func(*args, **kwargs)
[ 819.586435] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 819.586435] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] raise e
[ 819.586435] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 819.586435] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] nwinfo = self.network_api.allocate_for_instance(
[ 819.586435] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 819.586435] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] created_port_ids = self._update_ports_for_instance(
[ 819.586435] env[61439]: ERROR nova.compute.manager [instance:
de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 819.586435] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] with excutils.save_and_reraise_exception():
[ 819.586435] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 819.586835] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] self.force_reraise()
[ 819.586835] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 819.586835] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] raise self.value
[ 819.586835] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 819.586835] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] updated_port = self._update_port(
[ 819.586835] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 819.586835] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] _ensure_no_port_binding_failure(port)
[ 819.586835] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 819.586835] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65] raise exception.PortBindingFailed(port_id=port['id'])
[ 819.586835] env[61439]: ERROR nova.compute.manager [instance:
de136287-7dfb-4829-aef7-96ae82acfa65] nova.exception.PortBindingFailed: Binding failed for port b63cbd7f-55d4-4445-bc1f-668af5b37db5, please check neutron logs for more information.
[ 819.586835] env[61439]: ERROR nova.compute.manager [instance: de136287-7dfb-4829-aef7-96ae82acfa65]
[ 819.587181] env[61439]: DEBUG nova.compute.utils [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Binding failed for port b63cbd7f-55d4-4445-bc1f-668af5b37db5, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 819.588181] env[61439]: DEBUG nova.compute.manager [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Build of instance de136287-7dfb-4829-aef7-96ae82acfa65 was re-scheduled: Binding failed for port b63cbd7f-55d4-4445-bc1f-668af5b37db5, please check neutron logs for more information.
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 819.588633] env[61439]: DEBUG nova.compute.manager [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 819.588862] env[61439]: DEBUG oslo_concurrency.lockutils [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Acquiring lock "refresh_cache-de136287-7dfb-4829-aef7-96ae82acfa65" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 819.589033] env[61439]: DEBUG oslo_concurrency.lockutils [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Acquired lock "refresh_cache-de136287-7dfb-4829-aef7-96ae82acfa65" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 819.589201] env[61439]: DEBUG nova.network.neutron [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 819.694478] env[61439]: DEBUG nova.policy [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '721ff5c63c5f405bb7be8c486b5fd162', 'user_domain_id': 'default',
'system_scope': None, 'domain_id': None, 'project_id': 'd841db9575854aa388acc0bbb499fd52', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}}
[ 819.756363] env[61439]: DEBUG nova.network.neutron [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 819.909473] env[61439]: DEBUG nova.network.neutron [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Successfully created port: 28cf4c9f-fbd9-40b9-8aaf-326c2dc4a433 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 820.089100] env[61439]: DEBUG nova.network.neutron [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 820.104250] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Releasing lock "refresh_cache-cfda19ad-e831-4018-9228-e96fede0bae6" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 820.104546] env[61439]: DEBUG nova.compute.manager [None
req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 820.104765] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 820.105296] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5fdd7417-14da-4562-8017-67921cce5b80 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 820.115254] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9e0d39c-7e14-4d7a-a6b7-0c2ff50e5200 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 820.143323] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance cfda19ad-e831-4018-9228-e96fede0bae6 could not be found.
[ 820.143653] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 820.143835] env[61439]: INFO nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 820.144098] env[61439]: DEBUG oslo.service.loopingcall [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 820.144340] env[61439]: DEBUG nova.compute.manager [-] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 820.144436] env[61439]: DEBUG nova.network.neutron [-] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 820.214546] env[61439]: DEBUG nova.network.neutron [-] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Instance cache missing network info.
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 820.228389] env[61439]: DEBUG nova.network.neutron [-] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 820.236939] env[61439]: DEBUG nova.network.neutron [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 820.239526] env[61439]: INFO nova.compute.manager [-] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Took 0.09 seconds to deallocate network for instance.
[ 820.241997] env[61439]: DEBUG nova.compute.claims [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 820.242244] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 820.242423] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 820.251186] env[61439]: DEBUG oslo_concurrency.lockutils [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Releasing lock "refresh_cache-de136287-7dfb-4829-aef7-96ae82acfa65" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 820.251506] env[61439]: DEBUG nova.compute.manager [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 820.252318] env[61439]: DEBUG nova.compute.manager [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 820.252318] env[61439]: DEBUG nova.network.neutron [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 820.297190] env[61439]: DEBUG nova.network.neutron [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Instance cache missing network info.
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 820.307435] env[61439]: DEBUG nova.network.neutron [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 820.317950] env[61439]: INFO nova.compute.manager [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] [instance: de136287-7dfb-4829-aef7-96ae82acfa65] Took 0.07 seconds to deallocate network for instance.
[ 820.467333] env[61439]: INFO nova.scheduler.client.report [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Deleted allocations for instance de136287-7dfb-4829-aef7-96ae82acfa65
[ 820.493873] env[61439]: DEBUG oslo_concurrency.lockutils [None req-25dae54b-4644-4825-9a67-0cbe718eea52 tempest-ServersV294TestFqdnHostnames-1074575199 tempest-ServersV294TestFqdnHostnames-1074575199-project-member] Lock "de136287-7dfb-4829-aef7-96ae82acfa65" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.005s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 820.502823] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8327869c-832f-48b8-9621-b0a2890e4fc4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 820.513519] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with
opID=oslo.vmware-0376915a-47ae-4285-b7b7-f24a71e1b021 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 820.524453] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "da0610f3-ab6b-4496-ba18-2794869a2831" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 820.524453] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "da0610f3-ab6b-4496-ba18-2794869a2831" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 820.555127] env[61439]: DEBUG nova.compute.manager [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Starting instance...
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 820.558271] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62eac60e-778f-4651-9389-f7123d13e1d3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 820.567907] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4274d06b-5847-4a34-bf6c-62bd6f0d1005 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 820.583666] env[61439]: DEBUG nova.compute.provider_tree [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 820.593725] env[61439]: DEBUG nova.scheduler.client.report [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 820.630648] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Lock "compute_resources"
"released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.388s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 820.631470] env[61439]: ERROR nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 72f3e9d6-cf65-454e-b7a9-2f0d35c86416, please check neutron logs for more information.
[ 820.631470] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Traceback (most recent call last):
[ 820.631470] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 820.631470] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] self.driver.spawn(context, instance, image_meta,
[ 820.631470] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 820.631470] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 820.631470] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 820.631470] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] vm_ref = self.build_virtual_machine(instance,
[ 820.631470] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 820.631470] env[61439]: ERROR
nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] vif_infos = vmwarevif.get_vif_info(self._session,
[ 820.631470] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 820.631843] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] for vif in network_info:
[ 820.631843] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 820.631843] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] return self._sync_wrapper(fn, *args, **kwargs)
[ 820.631843] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 820.631843] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] self.wait()
[ 820.631843] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 820.631843] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] self[:] = self._gt.wait()
[ 820.631843] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 820.631843] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] return self._exit_event.wait()
[ 820.631843] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 820.631843] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] result = hub.switch()
[ 820.631843]
env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 820.631843] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] return self.greenlet.switch()
[ 820.632229] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 820.632229] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] result = function(*args, **kwargs)
[ 820.632229] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 820.632229] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] return func(*args, **kwargs)
[ 820.632229] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 820.632229] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] raise e
[ 820.632229] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 820.632229] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] nwinfo = self.network_api.allocate_for_instance(
[ 820.632229] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 820.632229] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] created_port_ids = self._update_ports_for_instance(
[ 820.632229] env[61439]: ERROR nova.compute.manager [instance:
cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 820.632229] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] with excutils.save_and_reraise_exception():
[ 820.632229] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 820.632666] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] self.force_reraise()
[ 820.632666] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 820.632666] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] raise self.value
[ 820.632666] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 820.632666] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] updated_port = self._update_port(
[ 820.632666] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 820.632666] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] _ensure_no_port_binding_failure(port)
[ 820.632666] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 820.632666] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] raise exception.PortBindingFailed(port_id=port['id'])
[ 820.632666] env[61439]: ERROR nova.compute.manager [instance:
cfda19ad-e831-4018-9228-e96fede0bae6] nova.exception.PortBindingFailed: Binding failed for port 72f3e9d6-cf65-454e-b7a9-2f0d35c86416, please check neutron logs for more information. [ 820.632666] env[61439]: ERROR nova.compute.manager [instance: cfda19ad-e831-4018-9228-e96fede0bae6] [ 820.633071] env[61439]: DEBUG nova.compute.utils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Binding failed for port 72f3e9d6-cf65-454e-b7a9-2f0d35c86416, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 820.636180] env[61439]: DEBUG nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Build of instance cfda19ad-e831-4018-9228-e96fede0bae6 was re-scheduled: Binding failed for port 72f3e9d6-cf65-454e-b7a9-2f0d35c86416, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 820.636180] env[61439]: DEBUG nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 820.636180] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Acquiring lock "refresh_cache-cfda19ad-e831-4018-9228-e96fede0bae6" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 820.636180] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Acquired lock "refresh_cache-cfda19ad-e831-4018-9228-e96fede0bae6" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 820.636635] env[61439]: DEBUG nova.network.neutron [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 820.641304] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 820.641534] 
env[61439]: DEBUG oslo_concurrency.lockutils [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 820.642985] env[61439]: INFO nova.compute.claims [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 820.812534] env[61439]: DEBUG nova.network.neutron [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 820.864468] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a25cdece-ef4f-472d-a88e-8908e4f0fd41 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 820.872530] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-874b20fe-a238-472c-ae46-4f394e02d6bd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 820.905721] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27eab1bf-39a5-4bcc-a1d3-c998a4f810a0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 820.913508] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-483d4557-a6e7-40cc-ab33-739cfc138601 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 820.927184] env[61439]: DEBUG nova.compute.provider_tree [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 820.939069] env[61439]: DEBUG nova.scheduler.client.report [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 
'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 820.954652] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.313s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 820.955176] env[61439]: DEBUG nova.compute.manager [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 820.997653] env[61439]: DEBUG nova.compute.utils [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 820.999191] env[61439]: DEBUG nova.compute.manager [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 820.999281] env[61439]: DEBUG nova.network.neutron [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 821.013031] env[61439]: DEBUG nova.compute.manager [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 821.091698] env[61439]: DEBUG nova.compute.manager [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 821.124662] env[61439]: DEBUG nova.virt.hardware [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 821.124911] env[61439]: DEBUG nova.virt.hardware [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 821.125175] env[61439]: DEBUG nova.virt.hardware [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 821.125438] env[61439]: DEBUG nova.virt.hardware [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Flavor pref 0:0:0 {{(pid=61439) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 821.125596] env[61439]: DEBUG nova.virt.hardware [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 821.125739] env[61439]: DEBUG nova.virt.hardware [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 821.125987] env[61439]: DEBUG nova.virt.hardware [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 821.126164] env[61439]: DEBUG nova.virt.hardware [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 821.126334] env[61439]: DEBUG nova.virt.hardware [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 821.126606] env[61439]: DEBUG nova.virt.hardware [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 821.126803] env[61439]: DEBUG nova.virt.hardware [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 821.132308] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98d60772-abd8-4281-b496-5703928e29ed {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 821.138213] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6379d87-e6fe-4678-af15-958942808e16 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 821.411256] env[61439]: DEBUG nova.network.neutron [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 821.412536] env[61439]: DEBUG nova.policy [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2af2fd8431af45ca891f744f4d10b54f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ca364a2df93a424f8b66ee39d9b0b120', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 
'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 821.420365] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Releasing lock "refresh_cache-cfda19ad-e831-4018-9228-e96fede0bae6" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 821.420740] env[61439]: DEBUG nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 821.421337] env[61439]: DEBUG nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 821.421629] env[61439]: DEBUG nova.network.neutron [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 821.511627] env[61439]: DEBUG nova.network.neutron [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 821.525299] env[61439]: DEBUG nova.network.neutron [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 821.534365] env[61439]: INFO nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: cfda19ad-e831-4018-9228-e96fede0bae6] Took 0.11 seconds to deallocate network for instance. [ 821.584874] env[61439]: DEBUG nova.network.neutron [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Successfully created port: 7284a82c-18a5-43a0-b4d9-67251e9955b7 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 821.669113] env[61439]: INFO nova.scheduler.client.report [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Deleted allocations for instance cfda19ad-e831-4018-9228-e96fede0bae6 [ 821.695472] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Lock "cfda19ad-e831-4018-9228-e96fede0bae6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.684s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 822.984635] env[61439]: DEBUG oslo_concurrency.lockutils [None 
req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquiring lock "7c3017dc-afd5-438e-ae23-ca0d7d4c01af" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 822.985574] env[61439]: DEBUG oslo_concurrency.lockutils [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "7c3017dc-afd5-438e-ae23-ca0d7d4c01af" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 822.998345] env[61439]: DEBUG nova.compute.manager [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 823.072949] env[61439]: DEBUG oslo_concurrency.lockutils [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 823.073223] env[61439]: DEBUG oslo_concurrency.lockutils [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 823.075060] env[61439]: INFO nova.compute.claims [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 823.137881] env[61439]: ERROR nova.compute.manager [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 650ee4d0-5f8e-44be-b3d7-48701ce29ed7, please check neutron logs for more information. 
[ 823.137881] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 823.137881] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 823.137881] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 823.137881] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 823.137881] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 823.137881] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 823.137881] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 823.137881] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 823.137881] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 823.137881] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 823.137881] env[61439]: ERROR nova.compute.manager raise self.value [ 823.137881] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 823.137881] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 823.137881] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 823.137881] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 823.138418] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 823.138418] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 823.138418] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 650ee4d0-5f8e-44be-b3d7-48701ce29ed7, please check neutron logs for more information. [ 823.138418] env[61439]: ERROR nova.compute.manager [ 823.139180] env[61439]: Traceback (most recent call last): [ 823.139251] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 823.139251] env[61439]: listener.cb(fileno) [ 823.139251] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 823.139251] env[61439]: result = function(*args, **kwargs) [ 823.139251] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 823.139251] env[61439]: return func(*args, **kwargs) [ 823.139251] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 823.139251] env[61439]: raise e [ 823.139251] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 823.139251] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 823.139552] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 823.139552] env[61439]: created_port_ids = self._update_ports_for_instance( [ 823.139552] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 823.139552] env[61439]: with excutils.save_and_reraise_exception(): [ 823.139552] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 823.139552] env[61439]: self.force_reraise() [ 823.139552] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 823.139552] env[61439]: raise self.value [ 823.139552] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 823.139552] env[61439]: 
updated_port = self._update_port( [ 823.139552] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 823.139552] env[61439]: _ensure_no_port_binding_failure(port) [ 823.139552] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 823.139552] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 823.139552] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 650ee4d0-5f8e-44be-b3d7-48701ce29ed7, please check neutron logs for more information. [ 823.139552] env[61439]: Removing descriptor: 10 [ 823.140695] env[61439]: ERROR nova.compute.manager [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 650ee4d0-5f8e-44be-b3d7-48701ce29ed7, please check neutron logs for more information. 
[ 823.140695] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Traceback (most recent call last): [ 823.140695] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 823.140695] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] yield resources [ 823.140695] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 823.140695] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] self.driver.spawn(context, instance, image_meta, [ 823.140695] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 823.140695] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 823.140695] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 823.140695] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] vm_ref = self.build_virtual_machine(instance, [ 823.140695] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 823.141087] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] vif_infos = vmwarevif.get_vif_info(self._session, [ 823.141087] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 823.141087] env[61439]: ERROR 
nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] for vif in network_info: [ 823.141087] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 823.141087] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] return self._sync_wrapper(fn, *args, **kwargs) [ 823.141087] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 823.141087] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] self.wait() [ 823.141087] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 823.141087] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] self[:] = self._gt.wait() [ 823.141087] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 823.141087] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] return self._exit_event.wait() [ 823.141087] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 823.141087] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] result = hub.switch() [ 823.141480] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 823.141480] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] return self.greenlet.switch() [ 823.141480] 
env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 823.141480] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] result = function(*args, **kwargs) [ 823.141480] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 823.141480] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] return func(*args, **kwargs) [ 823.141480] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 823.141480] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] raise e [ 823.141480] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 823.141480] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] nwinfo = self.network_api.allocate_for_instance( [ 823.141480] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 823.141480] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] created_port_ids = self._update_ports_for_instance( [ 823.141480] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 823.141870] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] with excutils.save_and_reraise_exception(): [ 823.141870] env[61439]: ERROR nova.compute.manager 
[instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 823.141870] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] self.force_reraise() [ 823.141870] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 823.141870] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] raise self.value [ 823.141870] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 823.141870] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] updated_port = self._update_port( [ 823.141870] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 823.141870] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] _ensure_no_port_binding_failure(port) [ 823.141870] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 823.141870] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] raise exception.PortBindingFailed(port_id=port['id']) [ 823.141870] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] nova.exception.PortBindingFailed: Binding failed for port 650ee4d0-5f8e-44be-b3d7-48701ce29ed7, please check neutron logs for more information. 
[ 823.141870] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] [ 823.142333] env[61439]: INFO nova.compute.manager [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Terminating instance [ 823.144490] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Acquiring lock "refresh_cache-282ecbf4-cd05-4ea0-bb3f-708969856b7e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 823.144718] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Acquired lock "refresh_cache-282ecbf4-cd05-4ea0-bb3f-708969856b7e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 823.144939] env[61439]: DEBUG nova.network.neutron [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 823.238277] env[61439]: DEBUG nova.network.neutron [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 823.307892] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7265745a-2b07-412e-b508-f53c41a1eaaf {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 823.313794] env[61439]: DEBUG nova.network.neutron [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Successfully created port: 9c39d9bd-de1a-462d-9218-e51ebfb16c02 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 823.322500] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27ca6d1e-f348-4b46-8ed3-da2b8489fb00 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 823.365032] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef910bfa-aa10-43f1-ac63-ad4717c0341f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 823.374935] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83ebe0b1-9efa-4c21-a300-1f396bdef27e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 823.390172] env[61439]: DEBUG nova.compute.provider_tree [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 823.402752] env[61439]: DEBUG 
nova.scheduler.client.report [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 823.421429] env[61439]: DEBUG oslo_concurrency.lockutils [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.348s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 823.421854] env[61439]: DEBUG nova.compute.manager [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Start building networks asynchronously for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 823.469431] env[61439]: DEBUG nova.compute.utils [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 823.469965] env[61439]: DEBUG nova.compute.manager [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 823.474032] env[61439]: DEBUG nova.network.neutron [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 823.489218] env[61439]: DEBUG nova.compute.manager [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 823.596967] env[61439]: DEBUG nova.compute.manager [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 823.621996] env[61439]: ERROR nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 410abb97-ddc2-44bc-8edb-f7e053fb7273, please check neutron logs for more information. [ 823.621996] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 823.621996] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 823.621996] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 823.621996] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 823.621996] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 823.621996] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 823.621996] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 823.621996] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 823.621996] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 823.621996] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 823.621996] env[61439]: ERROR nova.compute.manager raise self.value [ 823.621996] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 823.621996] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 823.621996] 
env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 823.621996] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 823.622524] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 823.622524] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 823.622524] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 410abb97-ddc2-44bc-8edb-f7e053fb7273, please check neutron logs for more information. [ 823.622524] env[61439]: ERROR nova.compute.manager [ 823.622524] env[61439]: Traceback (most recent call last): [ 823.622524] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 823.622524] env[61439]: listener.cb(fileno) [ 823.622524] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 823.622524] env[61439]: result = function(*args, **kwargs) [ 823.622524] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 823.622524] env[61439]: return func(*args, **kwargs) [ 823.622524] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 823.622524] env[61439]: raise e [ 823.622524] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 823.622524] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 823.622524] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 823.622524] env[61439]: created_port_ids = self._update_ports_for_instance( [ 823.622524] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 823.622524] env[61439]: with excutils.save_and_reraise_exception(): [ 823.622524] 
env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 823.622524] env[61439]: self.force_reraise() [ 823.622524] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 823.622524] env[61439]: raise self.value [ 823.622524] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 823.622524] env[61439]: updated_port = self._update_port( [ 823.622524] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 823.622524] env[61439]: _ensure_no_port_binding_failure(port) [ 823.622524] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 823.622524] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 823.623325] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 410abb97-ddc2-44bc-8edb-f7e053fb7273, please check neutron logs for more information. [ 823.623325] env[61439]: Removing descriptor: 18 [ 823.623325] env[61439]: ERROR nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 410abb97-ddc2-44bc-8edb-f7e053fb7273, please check neutron logs for more information. 
[ 823.623325] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Traceback (most recent call last): [ 823.623325] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 823.623325] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] yield resources [ 823.623325] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 823.623325] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] self.driver.spawn(context, instance, image_meta, [ 823.623325] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 823.623325] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] self._vmops.spawn(context, instance, image_meta, injected_files, [ 823.623325] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 823.623325] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] vm_ref = self.build_virtual_machine(instance, [ 823.624455] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 823.624455] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] vif_infos = vmwarevif.get_vif_info(self._session, [ 823.624455] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 823.624455] env[61439]: ERROR 
nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] for vif in network_info: [ 823.624455] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 823.624455] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] return self._sync_wrapper(fn, *args, **kwargs) [ 823.624455] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 823.624455] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] self.wait() [ 823.624455] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 823.624455] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] self[:] = self._gt.wait() [ 823.624455] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 823.624455] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] return self._exit_event.wait() [ 823.624455] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 823.624916] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] result = hub.switch() [ 823.624916] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 823.624916] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] return self.greenlet.switch() [ 823.624916] 
env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 823.624916] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] result = function(*args, **kwargs) [ 823.624916] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 823.624916] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] return func(*args, **kwargs) [ 823.624916] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 823.624916] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] raise e [ 823.624916] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 823.624916] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] nwinfo = self.network_api.allocate_for_instance( [ 823.624916] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 823.624916] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] created_port_ids = self._update_ports_for_instance( [ 823.626720] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 823.626720] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] with excutils.save_and_reraise_exception(): [ 823.626720] env[61439]: ERROR nova.compute.manager 
[instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 823.626720] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] self.force_reraise() [ 823.626720] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 823.626720] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] raise self.value [ 823.626720] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 823.626720] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] updated_port = self._update_port( [ 823.626720] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 823.626720] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] _ensure_no_port_binding_failure(port) [ 823.626720] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 823.626720] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] raise exception.PortBindingFailed(port_id=port['id']) [ 823.627328] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] nova.exception.PortBindingFailed: Binding failed for port 410abb97-ddc2-44bc-8edb-f7e053fb7273, please check neutron logs for more information. 
[ 823.627328] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] [ 823.627328] env[61439]: INFO nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Terminating instance [ 823.628554] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Acquiring lock "refresh_cache-eec2b74f-9b6c-4566-a0dc-da1fe9578715" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 823.628719] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Acquired lock "refresh_cache-eec2b74f-9b6c-4566-a0dc-da1fe9578715" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 823.628874] env[61439]: DEBUG nova.network.neutron [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 823.633978] env[61439]: DEBUG nova.virt.hardware [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Getting desirable topologies for flavor 
Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 823.634206] env[61439]: DEBUG nova.virt.hardware [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 823.634365] env[61439]: DEBUG nova.virt.hardware [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 823.634681] env[61439]: DEBUG nova.virt.hardware [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 823.634905] env[61439]: DEBUG nova.virt.hardware [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Image pref 0:0:0 
{{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 823.635489] env[61439]: DEBUG nova.virt.hardware [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 823.635745] env[61439]: DEBUG nova.virt.hardware [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 823.635917] env[61439]: DEBUG nova.virt.hardware [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 823.636105] env[61439]: DEBUG nova.virt.hardware [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 823.636375] env[61439]: DEBUG nova.virt.hardware [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 823.636455] env[61439]: DEBUG nova.virt.hardware [None req-34c11c68-0316-4863-b52d-c2ed6796f28d 
tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 823.640857] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0805ac4f-a2ca-41d8-8a00-28f92cc65cfc {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 823.648957] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b03a59ab-748f-451d-ac59-a898d1f85d6f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 823.758707] env[61439]: DEBUG nova.network.neutron [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 823.847470] env[61439]: DEBUG nova.policy [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b861ada4972f4431b0b9bd46ae21f7cc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '16074166244d449b99488fc24f4f3d74', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 823.988544] env[61439]: DEBUG nova.compute.manager [req-6a8f047e-d42f-423e-b1de-0fbce188c377 req-d079442f-a5ef-455d-8fb5-0b30e2f7b94a service nova] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Received event network-changed-650ee4d0-5f8e-44be-b3d7-48701ce29ed7 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 823.988887] env[61439]: DEBUG nova.compute.manager [req-6a8f047e-d42f-423e-b1de-0fbce188c377 req-d079442f-a5ef-455d-8fb5-0b30e2f7b94a service nova] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Refreshing instance network info cache due to event network-changed-650ee4d0-5f8e-44be-b3d7-48701ce29ed7. 
{{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 823.989474] env[61439]: DEBUG oslo_concurrency.lockutils [req-6a8f047e-d42f-423e-b1de-0fbce188c377 req-d079442f-a5ef-455d-8fb5-0b30e2f7b94a service nova] Acquiring lock "refresh_cache-282ecbf4-cd05-4ea0-bb3f-708969856b7e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 824.161436] env[61439]: DEBUG nova.network.neutron [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 824.175340] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Releasing lock "refresh_cache-282ecbf4-cd05-4ea0-bb3f-708969856b7e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 824.176908] env[61439]: DEBUG nova.compute.manager [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 824.177136] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 824.177474] env[61439]: DEBUG oslo_concurrency.lockutils [req-6a8f047e-d42f-423e-b1de-0fbce188c377 req-d079442f-a5ef-455d-8fb5-0b30e2f7b94a service nova] Acquired lock "refresh_cache-282ecbf4-cd05-4ea0-bb3f-708969856b7e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 824.177718] env[61439]: DEBUG nova.network.neutron [req-6a8f047e-d42f-423e-b1de-0fbce188c377 req-d079442f-a5ef-455d-8fb5-0b30e2f7b94a service nova] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Refreshing network info cache for port 650ee4d0-5f8e-44be-b3d7-48701ce29ed7 {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 824.181995] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b9257af1-d7c5-4e47-a9fd-8a4616df9def {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 824.192147] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd707df9-7f68-4e13-b796-2acaec3806ae {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 824.220361] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Instance does not exist on backend: nova.exception.InstanceNotFound: 
Instance 282ecbf4-cd05-4ea0-bb3f-708969856b7e could not be found. [ 824.220672] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 824.220837] env[61439]: INFO nova.compute.manager [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Took 0.04 seconds to destroy the instance on the hypervisor. [ 824.221104] env[61439]: DEBUG oslo.service.loopingcall [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 824.221357] env[61439]: DEBUG nova.compute.manager [-] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 824.221454] env[61439]: DEBUG nova.network.neutron [-] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 824.277174] env[61439]: DEBUG nova.network.neutron [req-6a8f047e-d42f-423e-b1de-0fbce188c377 req-d079442f-a5ef-455d-8fb5-0b30e2f7b94a service nova] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 824.370578] env[61439]: DEBUG nova.network.neutron [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 824.380125] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Releasing lock "refresh_cache-eec2b74f-9b6c-4566-a0dc-da1fe9578715" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 824.380548] env[61439]: DEBUG nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 824.380740] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 824.381619] env[61439]: DEBUG nova.network.neutron [-] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 824.383009] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-408546dd-7333-4704-82a2-ae67622307b5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 824.393450] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-254a49ea-39fd-4a2d-b56c-d2c259bf1c07 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 824.405032] env[61439]: DEBUG nova.network.neutron [-] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 824.413195] env[61439]: INFO nova.compute.manager [-] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Took 0.19 seconds to deallocate network for instance. [ 824.420854] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance eec2b74f-9b6c-4566-a0dc-da1fe9578715 could not be found. 
[ 824.421332] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 824.421540] env[61439]: INFO nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Took 0.04 seconds to destroy the instance on the hypervisor. [ 824.421780] env[61439]: DEBUG oslo.service.loopingcall [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 824.422249] env[61439]: DEBUG nova.compute.claims [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 824.422475] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 824.422697] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8456ed09-09bb-4c88-a761-169e4179e901 
tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 824.425704] env[61439]: DEBUG nova.compute.manager [-] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 824.425804] env[61439]: DEBUG nova.network.neutron [-] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 824.576437] env[61439]: DEBUG nova.network.neutron [-] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 824.589416] env[61439]: DEBUG nova.network.neutron [-] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 824.603312] env[61439]: INFO nova.compute.manager [-] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Took 0.18 seconds to deallocate network for instance. 
[ 824.605603] env[61439]: DEBUG nova.compute.claims [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 824.605693] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 824.653135] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1dc7ed7-42db-49e2-82a1-e9617c0c60e2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 824.661399] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09350b96-fb82-4cef-bc23-9adc4a4b9726 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 824.701402] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94146d0f-53bf-4a07-a1e7-d70644de9e75 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 824.709257] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4adf1aef-7e34-4dab-9c67-6586714c9d45 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 824.727020] env[61439]: DEBUG nova.compute.provider_tree [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 
tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 824.741485] env[61439]: DEBUG nova.scheduler.client.report [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 824.758567] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.336s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 824.759208] env[61439]: ERROR nova.compute.manager [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 650ee4d0-5f8e-44be-b3d7-48701ce29ed7, please check neutron logs for more information. 
[ 824.759208] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Traceback (most recent call last): [ 824.759208] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 824.759208] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] self.driver.spawn(context, instance, image_meta, [ 824.759208] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 824.759208] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 824.759208] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 824.759208] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] vm_ref = self.build_virtual_machine(instance, [ 824.759208] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 824.759208] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] vif_infos = vmwarevif.get_vif_info(self._session, [ 824.759208] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 824.759558] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] for vif in network_info: [ 824.759558] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 824.759558] env[61439]: ERROR 
nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] return self._sync_wrapper(fn, *args, **kwargs) [ 824.759558] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 824.759558] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] self.wait() [ 824.759558] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 824.759558] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] self[:] = self._gt.wait() [ 824.759558] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 824.759558] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] return self._exit_event.wait() [ 824.759558] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 824.759558] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] result = hub.switch() [ 824.759558] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 824.759558] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] return self.greenlet.switch() [ 824.759951] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 824.759951] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] result 
= function(*args, **kwargs) [ 824.759951] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 824.759951] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] return func(*args, **kwargs) [ 824.759951] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 824.759951] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] raise e [ 824.759951] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 824.759951] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] nwinfo = self.network_api.allocate_for_instance( [ 824.759951] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 824.759951] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] created_port_ids = self._update_ports_for_instance( [ 824.759951] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 824.759951] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] with excutils.save_and_reraise_exception(): [ 824.759951] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 824.760365] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] self.force_reraise() [ 824.760365] env[61439]: 
ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 824.760365] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] raise self.value [ 824.760365] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 824.760365] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] updated_port = self._update_port( [ 824.760365] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 824.760365] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] _ensure_no_port_binding_failure(port) [ 824.760365] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 824.760365] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] raise exception.PortBindingFailed(port_id=port['id']) [ 824.760365] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] nova.exception.PortBindingFailed: Binding failed for port 650ee4d0-5f8e-44be-b3d7-48701ce29ed7, please check neutron logs for more information. 
[ 824.760365] env[61439]: ERROR nova.compute.manager [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] [ 824.760673] env[61439]: DEBUG nova.compute.utils [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Binding failed for port 650ee4d0-5f8e-44be-b3d7-48701ce29ed7, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 824.761141] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.155s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 824.763930] env[61439]: DEBUG nova.compute.manager [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Build of instance 282ecbf4-cd05-4ea0-bb3f-708969856b7e was re-scheduled: Binding failed for port 650ee4d0-5f8e-44be-b3d7-48701ce29ed7, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 824.764515] env[61439]: DEBUG nova.compute.manager [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 824.764722] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Acquiring lock "refresh_cache-282ecbf4-cd05-4ea0-bb3f-708969856b7e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 824.968376] env[61439]: ERROR nova.compute.manager [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port def2fbbb-a0c8-4947-8402-02a1fc787cff, please check neutron logs for more information. 
[ 824.968376] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 824.968376] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 824.968376] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 824.968376] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 824.968376] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 824.968376] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 824.968376] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 824.968376] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 824.968376] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 824.968376] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 824.968376] env[61439]: ERROR nova.compute.manager raise self.value [ 824.968376] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 824.968376] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 824.968376] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 824.968376] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 824.968858] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 824.968858] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 824.968858] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port def2fbbb-a0c8-4947-8402-02a1fc787cff, please check neutron logs for more information. [ 824.968858] env[61439]: ERROR nova.compute.manager [ 824.968858] env[61439]: Traceback (most recent call last): [ 824.968858] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 824.968858] env[61439]: listener.cb(fileno) [ 824.968858] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 824.968858] env[61439]: result = function(*args, **kwargs) [ 824.968858] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 824.968858] env[61439]: return func(*args, **kwargs) [ 824.968858] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 824.968858] env[61439]: raise e [ 824.968858] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 824.968858] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 824.968858] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 824.968858] env[61439]: created_port_ids = self._update_ports_for_instance( [ 824.968858] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 824.968858] env[61439]: with excutils.save_and_reraise_exception(): [ 824.968858] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 824.968858] env[61439]: self.force_reraise() [ 824.968858] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 824.968858] env[61439]: raise self.value [ 824.968858] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 824.968858] env[61439]: 
updated_port = self._update_port( [ 824.968858] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 824.968858] env[61439]: _ensure_no_port_binding_failure(port) [ 824.968858] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 824.968858] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 824.969759] env[61439]: nova.exception.PortBindingFailed: Binding failed for port def2fbbb-a0c8-4947-8402-02a1fc787cff, please check neutron logs for more information. [ 824.969759] env[61439]: Removing descriptor: 22 [ 824.969759] env[61439]: ERROR nova.compute.manager [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port def2fbbb-a0c8-4947-8402-02a1fc787cff, please check neutron logs for more information. 
[ 824.969759] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Traceback (most recent call last): [ 824.969759] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 824.969759] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] yield resources [ 824.969759] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 824.969759] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] self.driver.spawn(context, instance, image_meta, [ 824.969759] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 824.969759] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 824.969759] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 824.969759] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] vm_ref = self.build_virtual_machine(instance, [ 824.970119] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 824.970119] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] vif_infos = vmwarevif.get_vif_info(self._session, [ 824.970119] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 824.970119] env[61439]: ERROR 
nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] for vif in network_info: [ 824.970119] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 824.970119] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] return self._sync_wrapper(fn, *args, **kwargs) [ 824.970119] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 824.970119] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] self.wait() [ 824.970119] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 824.970119] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] self[:] = self._gt.wait() [ 824.970119] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 824.970119] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] return self._exit_event.wait() [ 824.970119] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 824.970518] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] result = hub.switch() [ 824.970518] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 824.970518] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] return self.greenlet.switch() [ 824.970518] 
env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 824.970518] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] result = function(*args, **kwargs) [ 824.970518] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 824.970518] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] return func(*args, **kwargs) [ 824.970518] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 824.970518] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] raise e [ 824.970518] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 824.970518] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] nwinfo = self.network_api.allocate_for_instance( [ 824.970518] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 824.970518] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] created_port_ids = self._update_ports_for_instance( [ 824.970869] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 824.970869] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] with excutils.save_and_reraise_exception(): [ 824.970869] env[61439]: ERROR nova.compute.manager 
[instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 824.970869] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] self.force_reraise() [ 824.970869] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 824.970869] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] raise self.value [ 824.970869] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 824.970869] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] updated_port = self._update_port( [ 824.970869] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 824.970869] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] _ensure_no_port_binding_failure(port) [ 824.970869] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 824.970869] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] raise exception.PortBindingFailed(port_id=port['id']) [ 824.971244] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] nova.exception.PortBindingFailed: Binding failed for port def2fbbb-a0c8-4947-8402-02a1fc787cff, please check neutron logs for more information. 
[ 824.971244] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] [ 824.971244] env[61439]: INFO nova.compute.manager [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Terminating instance [ 824.973322] env[61439]: DEBUG nova.network.neutron [req-6a8f047e-d42f-423e-b1de-0fbce188c377 req-d079442f-a5ef-455d-8fb5-0b30e2f7b94a service nova] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 824.976167] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Acquiring lock "refresh_cache-0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 824.976167] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Acquired lock "refresh_cache-0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 824.976167] env[61439]: DEBUG nova.network.neutron [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 824.992134] env[61439]: DEBUG oslo_concurrency.lockutils [req-6a8f047e-d42f-423e-b1de-0fbce188c377 req-d079442f-a5ef-455d-8fb5-0b30e2f7b94a service 
nova] Releasing lock "refresh_cache-282ecbf4-cd05-4ea0-bb3f-708969856b7e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 824.992915] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Acquired lock "refresh_cache-282ecbf4-cd05-4ea0-bb3f-708969856b7e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 824.993124] env[61439]: DEBUG nova.network.neutron [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 824.996256] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9a4442a-ddf6-4106-bd73-c35f791c3f71 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 825.011579] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9683887-d983-48ff-83c1-412cc0be1406 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 825.053756] env[61439]: DEBUG nova.network.neutron [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 825.053863] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4bf30cc-c94c-489f-ad74-0eea3b341b08 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 825.062097] env[61439]: DEBUG nova.network.neutron [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 825.064264] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c67561d-7ea4-4f86-bcd7-23f1206495d2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 825.080329] env[61439]: DEBUG nova.compute.provider_tree [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 825.092249] env[61439]: DEBUG nova.scheduler.client.report [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 
'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 825.106911] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.346s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 825.107480] env[61439]: ERROR nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 410abb97-ddc2-44bc-8edb-f7e053fb7273, please check neutron logs for more information. 
[ 825.107480] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Traceback (most recent call last): [ 825.107480] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 825.107480] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] self.driver.spawn(context, instance, image_meta, [ 825.107480] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 825.107480] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] self._vmops.spawn(context, instance, image_meta, injected_files, [ 825.107480] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 825.107480] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] vm_ref = self.build_virtual_machine(instance, [ 825.107480] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 825.107480] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] vif_infos = vmwarevif.get_vif_info(self._session, [ 825.107480] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 825.107907] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] for vif in network_info: [ 825.107907] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 825.107907] env[61439]: ERROR 
nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] return self._sync_wrapper(fn, *args, **kwargs) [ 825.107907] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 825.107907] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] self.wait() [ 825.107907] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 825.107907] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] self[:] = self._gt.wait() [ 825.107907] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 825.107907] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] return self._exit_event.wait() [ 825.107907] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 825.107907] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] result = hub.switch() [ 825.107907] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 825.107907] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] return self.greenlet.switch() [ 825.108428] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 825.108428] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] result 
= function(*args, **kwargs) [ 825.108428] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 825.108428] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] return func(*args, **kwargs) [ 825.108428] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 825.108428] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] raise e [ 825.108428] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 825.108428] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] nwinfo = self.network_api.allocate_for_instance( [ 825.108428] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 825.108428] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] created_port_ids = self._update_ports_for_instance( [ 825.108428] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 825.108428] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] with excutils.save_and_reraise_exception(): [ 825.108428] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 825.108841] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] self.force_reraise() [ 825.108841] env[61439]: 
ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 825.108841] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] raise self.value [ 825.108841] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 825.108841] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] updated_port = self._update_port( [ 825.108841] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 825.108841] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] _ensure_no_port_binding_failure(port) [ 825.108841] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 825.108841] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] raise exception.PortBindingFailed(port_id=port['id']) [ 825.108841] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] nova.exception.PortBindingFailed: Binding failed for port 410abb97-ddc2-44bc-8edb-f7e053fb7273, please check neutron logs for more information. [ 825.108841] env[61439]: ERROR nova.compute.manager [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] [ 825.109530] env[61439]: DEBUG nova.compute.utils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Binding failed for port 410abb97-ddc2-44bc-8edb-f7e053fb7273, please check neutron logs for more information. 
{{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 825.109785] env[61439]: DEBUG nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Build of instance eec2b74f-9b6c-4566-a0dc-da1fe9578715 was re-scheduled: Binding failed for port 410abb97-ddc2-44bc-8edb-f7e053fb7273, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 825.110146] env[61439]: DEBUG nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 825.110462] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Acquiring lock "refresh_cache-eec2b74f-9b6c-4566-a0dc-da1fe9578715" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 825.110537] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Acquired lock "refresh_cache-eec2b74f-9b6c-4566-a0dc-da1fe9578715" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 825.110669] env[61439]: DEBUG nova.network.neutron [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Building network info cache 
for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 825.231120] env[61439]: DEBUG nova.network.neutron [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 825.492888] env[61439]: DEBUG nova.network.neutron [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 825.501576] env[61439]: DEBUG nova.network.neutron [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 825.508700] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Releasing lock "refresh_cache-0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 825.508883] env[61439]: DEBUG nova.compute.manager [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 825.509342] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 825.510016] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c51f0908-09de-47b5-8d46-5103f77f4066 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 825.513958] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Releasing lock "refresh_cache-282ecbf4-cd05-4ea0-bb3f-708969856b7e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 825.514199] env[61439]: DEBUG nova.compute.manager [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 825.514388] env[61439]: DEBUG nova.compute.manager [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 825.514686] env[61439]: DEBUG nova.network.neutron [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 825.523746] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f35924e4-3952-4870-af19-fecbf4d5a09d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 825.550856] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4 could not be found. 
[ 825.551123] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 825.551330] env[61439]: INFO nova.compute.manager [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Took 0.04 seconds to destroy the instance on the hypervisor. [ 825.551752] env[61439]: DEBUG oslo.service.loopingcall [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 825.552124] env[61439]: DEBUG nova.compute.manager [-] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 825.552124] env[61439]: DEBUG nova.network.neutron [-] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 825.568335] env[61439]: DEBUG nova.network.neutron [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 825.577656] env[61439]: DEBUG nova.network.neutron [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 825.587498] env[61439]: INFO nova.compute.manager [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] [instance: 282ecbf4-cd05-4ea0-bb3f-708969856b7e] Took 0.07 seconds to deallocate network for instance. [ 825.604426] env[61439]: DEBUG nova.network.neutron [-] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 825.614320] env[61439]: DEBUG nova.network.neutron [-] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 825.631564] env[61439]: INFO nova.compute.manager [-] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Took 0.08 seconds to deallocate network for instance. 
[ 825.633875] env[61439]: DEBUG nova.compute.claims [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 825.634133] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 825.634374] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 825.719467] env[61439]: INFO nova.scheduler.client.report [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Deleted allocations for instance 282ecbf4-cd05-4ea0-bb3f-708969856b7e [ 825.746620] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8456ed09-09bb-4c88-a761-169e4179e901 tempest-ServerRescueTestJSONUnderV235-809091443 tempest-ServerRescueTestJSONUnderV235-809091443-project-member] Lock "282ecbf4-cd05-4ea0-bb3f-708969856b7e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.890s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 825.839479] env[61439]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57e06832-5a92-4b14-b118-529c03dfa7ba {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 825.848152] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de9cf4ec-5dc9-475b-b95b-30ac921fe0ee {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 825.884414] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9119808-a978-4111-a463-411abae3f357 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 825.892435] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a631e72b-559f-4425-a711-de8f7170902d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 825.906461] env[61439]: DEBUG nova.compute.provider_tree [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 825.920899] env[61439]: DEBUG nova.scheduler.client.report [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': 
{'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 825.940354] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.305s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 825.940768] env[61439]: ERROR nova.compute.manager [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port def2fbbb-a0c8-4947-8402-02a1fc787cff, please check neutron logs for more information. 
[ 825.940768] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Traceback (most recent call last): [ 825.940768] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 825.940768] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] self.driver.spawn(context, instance, image_meta, [ 825.940768] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 825.940768] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 825.940768] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 825.940768] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] vm_ref = self.build_virtual_machine(instance, [ 825.940768] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 825.940768] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] vif_infos = vmwarevif.get_vif_info(self._session, [ 825.940768] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 825.941194] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] for vif in network_info: [ 825.941194] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 825.941194] env[61439]: ERROR 
nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] return self._sync_wrapper(fn, *args, **kwargs) [ 825.941194] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 825.941194] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] self.wait() [ 825.941194] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 825.941194] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] self[:] = self._gt.wait() [ 825.941194] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 825.941194] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] return self._exit_event.wait() [ 825.941194] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 825.941194] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] result = hub.switch() [ 825.941194] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 825.941194] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] return self.greenlet.switch() [ 825.941539] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 825.941539] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] result 
= function(*args, **kwargs) [ 825.941539] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 825.941539] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] return func(*args, **kwargs) [ 825.941539] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 825.941539] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] raise e [ 825.941539] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 825.941539] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] nwinfo = self.network_api.allocate_for_instance( [ 825.941539] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 825.941539] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] created_port_ids = self._update_ports_for_instance( [ 825.941539] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 825.941539] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] with excutils.save_and_reraise_exception(): [ 825.941539] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 825.941931] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] self.force_reraise() [ 825.941931] env[61439]: 
ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 825.941931] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] raise self.value [ 825.941931] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 825.941931] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] updated_port = self._update_port( [ 825.941931] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 825.941931] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] _ensure_no_port_binding_failure(port) [ 825.941931] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 825.941931] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] raise exception.PortBindingFailed(port_id=port['id']) [ 825.941931] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] nova.exception.PortBindingFailed: Binding failed for port def2fbbb-a0c8-4947-8402-02a1fc787cff, please check neutron logs for more information. 
[ 825.941931] env[61439]: ERROR nova.compute.manager [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] [ 825.942241] env[61439]: DEBUG nova.compute.utils [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Binding failed for port def2fbbb-a0c8-4947-8402-02a1fc787cff, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 825.943517] env[61439]: DEBUG nova.compute.manager [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Build of instance 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4 was re-scheduled: Binding failed for port def2fbbb-a0c8-4947-8402-02a1fc787cff, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 825.943517] env[61439]: DEBUG nova.compute.manager [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 825.943765] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Acquiring lock "refresh_cache-0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 825.943949] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 
tempest-ServerPasswordTestJSON-1865709812-project-member] Acquired lock "refresh_cache-0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 825.944158] env[61439]: DEBUG nova.network.neutron [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 826.033878] env[61439]: DEBUG nova.network.neutron [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 826.048833] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Releasing lock "refresh_cache-eec2b74f-9b6c-4566-a0dc-da1fe9578715" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 826.049089] env[61439]: DEBUG nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 826.049277] env[61439]: DEBUG nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 826.049447] env[61439]: DEBUG nova.network.neutron [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 826.088732] env[61439]: DEBUG nova.network.neutron [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 826.378485] env[61439]: DEBUG nova.network.neutron [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 826.390433] env[61439]: DEBUG nova.network.neutron [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 826.412212] env[61439]: INFO nova.compute.manager [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: eec2b74f-9b6c-4566-a0dc-da1fe9578715] Took 0.36 seconds to deallocate network for instance. [ 826.568199] env[61439]: INFO nova.scheduler.client.report [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Deleted allocations for instance eec2b74f-9b6c-4566-a0dc-da1fe9578715 [ 826.609558] env[61439]: DEBUG oslo_concurrency.lockutils [None req-256876ce-f95d-4e2a-9b31-55c62edbd9ad tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Lock "eec2b74f-9b6c-4566-a0dc-da1fe9578715" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.640s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 826.779589] env[61439]: DEBUG nova.network.neutron [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 826.798949] env[61439]: DEBUG oslo_concurrency.lockutils [None 
req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Releasing lock "refresh_cache-0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 826.798949] env[61439]: DEBUG nova.compute.manager [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 826.798949] env[61439]: DEBUG nova.compute.manager [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 826.798949] env[61439]: DEBUG nova.network.neutron [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 826.877135] env[61439]: DEBUG nova.network.neutron [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 826.894321] env[61439]: DEBUG nova.network.neutron [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 826.910475] env[61439]: DEBUG nova.network.neutron [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Successfully created port: c11db4be-ceba-4564-a97e-7f4fc2d8afa7 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 826.916555] env[61439]: INFO nova.compute.manager [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] [instance: 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4] Took 0.12 seconds to deallocate network for instance. 
[ 827.046383] env[61439]: INFO nova.scheduler.client.report [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Deleted allocations for instance 0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4 [ 827.072485] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7f9b5c6e-cb7d-43d8-9e77-39cb639cc993 tempest-ServerPasswordTestJSON-1865709812 tempest-ServerPasswordTestJSON-1865709812-project-member] Lock "0bbf5ae6-c31d-4ea2-8581-6bed6c11c2d4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.563s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 828.944107] env[61439]: DEBUG nova.compute.manager [req-076e5fef-692b-42cd-9263-6b833076057c req-45170dee-c6e0-426a-afc4-87fa478d1804 service nova] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Received event network-changed-28cf4c9f-fbd9-40b9-8aaf-326c2dc4a433 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 828.944107] env[61439]: DEBUG nova.compute.manager [req-076e5fef-692b-42cd-9263-6b833076057c req-45170dee-c6e0-426a-afc4-87fa478d1804 service nova] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Refreshing instance network info cache due to event network-changed-28cf4c9f-fbd9-40b9-8aaf-326c2dc4a433. 
{{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 828.944107] env[61439]: DEBUG oslo_concurrency.lockutils [req-076e5fef-692b-42cd-9263-6b833076057c req-45170dee-c6e0-426a-afc4-87fa478d1804 service nova] Acquiring lock "refresh_cache-b779ccfb-9278-4a00-aadf-bd4afb0ab54a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 828.944107] env[61439]: DEBUG oslo_concurrency.lockutils [req-076e5fef-692b-42cd-9263-6b833076057c req-45170dee-c6e0-426a-afc4-87fa478d1804 service nova] Acquired lock "refresh_cache-b779ccfb-9278-4a00-aadf-bd4afb0ab54a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 828.944107] env[61439]: DEBUG nova.network.neutron [req-076e5fef-692b-42cd-9263-6b833076057c req-45170dee-c6e0-426a-afc4-87fa478d1804 service nova] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Refreshing network info cache for port 28cf4c9f-fbd9-40b9-8aaf-326c2dc4a433 {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 829.127159] env[61439]: DEBUG nova.network.neutron [req-076e5fef-692b-42cd-9263-6b833076057c req-45170dee-c6e0-426a-afc4-87fa478d1804 service nova] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 829.203938] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 829.203938] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Starting heal instance info cache {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 829.203938] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Rebuilding the list of instances to heal {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 829.230836] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 4527b287-d099-443c-a424-185d02054be0] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 829.231015] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 829.231160] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Skipping network cache update for instance because it is Building. 
{{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 829.231290] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 829.231419] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 829.232089] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 829.232361] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 829.232620] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Didn't find any instances for network info cache update. 
{{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 829.233081] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 829.331182] env[61439]: ERROR nova.compute.manager [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 28cf4c9f-fbd9-40b9-8aaf-326c2dc4a433, please check neutron logs for more information. [ 829.331182] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 829.331182] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 829.331182] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 829.331182] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 829.331182] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 829.331182] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 829.331182] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 829.331182] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 829.331182] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 829.331182] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 829.331182] env[61439]: ERROR 
nova.compute.manager raise self.value [ 829.331182] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 829.331182] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 829.331182] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 829.331182] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 829.331683] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 829.331683] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 829.331683] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 28cf4c9f-fbd9-40b9-8aaf-326c2dc4a433, please check neutron logs for more information. [ 829.331683] env[61439]: ERROR nova.compute.manager [ 829.331683] env[61439]: Traceback (most recent call last): [ 829.331683] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 829.331683] env[61439]: listener.cb(fileno) [ 829.331683] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 829.331683] env[61439]: result = function(*args, **kwargs) [ 829.331683] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 829.331683] env[61439]: return func(*args, **kwargs) [ 829.331683] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 829.331683] env[61439]: raise e [ 829.331683] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 829.331683] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 829.331683] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 
829.331683] env[61439]: created_port_ids = self._update_ports_for_instance( [ 829.331683] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 829.331683] env[61439]: with excutils.save_and_reraise_exception(): [ 829.331683] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 829.331683] env[61439]: self.force_reraise() [ 829.331683] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 829.331683] env[61439]: raise self.value [ 829.331683] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 829.331683] env[61439]: updated_port = self._update_port( [ 829.331683] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 829.331683] env[61439]: _ensure_no_port_binding_failure(port) [ 829.331683] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 829.331683] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 829.332482] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 28cf4c9f-fbd9-40b9-8aaf-326c2dc4a433, please check neutron logs for more information. [ 829.332482] env[61439]: Removing descriptor: 20 [ 829.332482] env[61439]: ERROR nova.compute.manager [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 28cf4c9f-fbd9-40b9-8aaf-326c2dc4a433, please check neutron logs for more information. 
[ 829.332482] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Traceback (most recent call last):
[ 829.332482] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 829.332482] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     yield resources
[ 829.332482] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 829.332482] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     self.driver.spawn(context, instance, image_meta,
[ 829.332482] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 829.332482] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 829.332482] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 829.332482] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     vm_ref = self.build_virtual_machine(instance,
[ 829.332834] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 829.332834] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 829.332834] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 829.332834] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     for vif in network_info:
[ 829.332834] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 829.332834] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     return self._sync_wrapper(fn, *args, **kwargs)
[ 829.332834] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 829.332834] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     self.wait()
[ 829.332834] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 829.332834] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     self[:] = self._gt.wait()
[ 829.332834] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 829.332834] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     return self._exit_event.wait()
[ 829.332834] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 829.333241] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     result = hub.switch()
[ 829.333241] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 829.333241] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     return self.greenlet.switch()
[ 829.333241] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 829.333241] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     result = function(*args, **kwargs)
[ 829.333241] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 829.333241] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     return func(*args, **kwargs)
[ 829.333241] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 829.333241] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     raise e
[ 829.333241] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 829.333241] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     nwinfo = self.network_api.allocate_for_instance(
[ 829.333241] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 829.333241] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     created_port_ids = self._update_ports_for_instance(
[ 829.333646] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 829.333646] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     with excutils.save_and_reraise_exception():
[ 829.333646] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 829.333646] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     self.force_reraise()
[ 829.333646] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 829.333646] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     raise self.value
[ 829.333646] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 829.333646] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     updated_port = self._update_port(
[ 829.333646] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 829.333646] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     _ensure_no_port_binding_failure(port)
[ 829.333646] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 829.333646] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     raise exception.PortBindingFailed(port_id=port['id'])
[ 829.334027] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] nova.exception.PortBindingFailed: Binding failed for port 28cf4c9f-fbd9-40b9-8aaf-326c2dc4a433, please check neutron logs for more information.
[ 829.334027] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]
[ 829.334027] env[61439]: INFO nova.compute.manager [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Terminating instance
[ 829.335559] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Acquiring lock "refresh_cache-b779ccfb-9278-4a00-aadf-bd4afb0ab54a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 829.984374] env[61439]: DEBUG nova.network.neutron [req-076e5fef-692b-42cd-9263-6b833076057c req-45170dee-c6e0-426a-afc4-87fa478d1804 service nova] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 829.997538] env[61439]: DEBUG oslo_concurrency.lockutils [req-076e5fef-692b-42cd-9263-6b833076057c req-45170dee-c6e0-426a-afc4-87fa478d1804 service nova] Releasing lock "refresh_cache-b779ccfb-9278-4a00-aadf-bd4afb0ab54a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 829.997727] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Acquired lock "refresh_cache-b779ccfb-9278-4a00-aadf-bd4afb0ab54a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 829.997869] env[61439]: DEBUG nova.network.neutron [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 830.077520] env[61439]: DEBUG nova.network.neutron [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 830.202304] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 830.202540] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager.update_available_resource {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 830.215431] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 830.215978] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 830.215978] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 830.215978] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=61439) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 830.217195] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5599ab13-e833-4321-8674-f93647fb51d5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 830.226748] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7eefca99-b0e6-4d6e-94c9-232025bc3d67 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 830.243465] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d07ffed0-adbb-49d7-80fe-c41affe2ccbe {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 830.251548] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf183cdd-97f1-4b65-a732-1e9e467b3c6d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 830.285030] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181526MB free_disk=35GB free_vcpus=48 pci_devices=None {{(pid=61439) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 830.285242] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 830.285397] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 830.386350] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 4527b287-d099-443c-a424-185d02054be0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 830.386656] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 42ca8a89-5938-491b-b122-deac71d18505 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 830.386890] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance bf9101c9-4072-4f72-8ac3-24b7a5b88b45 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 830.387202] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance b779ccfb-9278-4a00-aadf-bd4afb0ab54a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 830.387476] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 50266004-15d2-46cf-9f48-315f24831d24 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 830.387697] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance da0610f3-ab6b-4496-ba18-2794869a2831 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 830.387926] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 7c3017dc-afd5-438e-ae23-ca0d7d4c01af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 830.388649] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 830.389413] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 830.533613] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59a6c423-672a-4667-9f28-566ec6b54f52 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 830.542948] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c520260-7339-4908-adeb-0058f35c1071 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 830.579534] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c881d83-7a21-4163-81b6-8290c70d9fcd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 830.588310] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6ca3991-f8d9-40ff-9354-d62157d6ce41 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 830.603544] env[61439]: DEBUG nova.compute.provider_tree [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 830.613285] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 830.632263] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=61439) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 830.632506] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.347s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 830.687055] env[61439]: DEBUG nova.network.neutron [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 830.699089] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Releasing lock "refresh_cache-b779ccfb-9278-4a00-aadf-bd4afb0ab54a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 830.700677] env[61439]: DEBUG nova.compute.manager [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 830.700677] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 830.701163] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d33755d4-2972-4ed9-b456-0da843c8c1b0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 830.714124] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0527745e-a97f-45d8-a7a7-6469a484b0d4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 830.738980] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b779ccfb-9278-4a00-aadf-bd4afb0ab54a could not be found.
[ 830.739239] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 830.739425] env[61439]: INFO nova.compute.manager [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 830.739686] env[61439]: DEBUG oslo.service.loopingcall [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 830.739932] env[61439]: DEBUG nova.compute.manager [-] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 830.740542] env[61439]: DEBUG nova.network.neutron [-] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 830.816245] env[61439]: DEBUG nova.network.neutron [-] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 830.828081] env[61439]: DEBUG nova.network.neutron [-] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 830.849186] env[61439]: INFO nova.compute.manager [-] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Took 0.11 seconds to deallocate network for instance.
[ 830.851557] env[61439]: DEBUG nova.compute.claims [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 830.851751] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 830.851977] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 831.036484] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25253bc3-78b9-402d-8723-16d7601770c6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 831.047018] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e8069dc-45d7-46b8-a6b2-9f25329bd442 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 831.091368] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf650e51-6d7d-400e-b232-e14dd184f74a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 831.099084] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a44f5f47-ae6e-4238-b409-1aab45675ee8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 831.115168] env[61439]: DEBUG nova.compute.provider_tree [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 831.127073] env[61439]: DEBUG nova.scheduler.client.report [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 831.140089] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.288s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 831.140802] env[61439]: ERROR nova.compute.manager [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 28cf4c9f-fbd9-40b9-8aaf-326c2dc4a433, please check neutron logs for more information.
[ 831.140802] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Traceback (most recent call last):
[ 831.140802] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 831.140802] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     self.driver.spawn(context, instance, image_meta,
[ 831.140802] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 831.140802] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 831.140802] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 831.140802] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     vm_ref = self.build_virtual_machine(instance,
[ 831.140802] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 831.140802] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 831.140802] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 831.141202] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     for vif in network_info:
[ 831.141202] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 831.141202] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     return self._sync_wrapper(fn, *args, **kwargs)
[ 831.141202] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 831.141202] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     self.wait()
[ 831.141202] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 831.141202] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     self[:] = self._gt.wait()
[ 831.141202] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 831.141202] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     return self._exit_event.wait()
[ 831.141202] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 831.141202] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     result = hub.switch()
[ 831.141202] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 831.141202] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     return self.greenlet.switch()
[ 831.141614] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 831.141614] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     result = function(*args, **kwargs)
[ 831.141614] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 831.141614] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     return func(*args, **kwargs)
[ 831.141614] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 831.141614] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     raise e
[ 831.141614] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 831.141614] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     nwinfo = self.network_api.allocate_for_instance(
[ 831.141614] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 831.141614] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     created_port_ids = self._update_ports_for_instance(
[ 831.141614] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 831.141614] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     with excutils.save_and_reraise_exception():
[ 831.141614] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 831.142059] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     self.force_reraise()
[ 831.142059] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 831.142059] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     raise self.value
[ 831.142059] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 831.142059] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     updated_port = self._update_port(
[ 831.142059] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 831.142059] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     _ensure_no_port_binding_failure(port)
[ 831.142059] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 831.142059] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]     raise exception.PortBindingFailed(port_id=port['id'])
[ 831.142059] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] nova.exception.PortBindingFailed: Binding failed for port 28cf4c9f-fbd9-40b9-8aaf-326c2dc4a433, please check neutron logs for more information.
[ 831.142059] env[61439]: ERROR nova.compute.manager [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a]
[ 831.142437] env[61439]: DEBUG nova.compute.utils [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Binding failed for port 28cf4c9f-fbd9-40b9-8aaf-326c2dc4a433, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 831.144155] env[61439]: DEBUG nova.compute.manager [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Build of instance b779ccfb-9278-4a00-aadf-bd4afb0ab54a was re-scheduled: Binding failed for port 28cf4c9f-fbd9-40b9-8aaf-326c2dc4a433, please check neutron logs for more information.
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 831.144155] env[61439]: DEBUG nova.compute.manager [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 831.144155] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Acquiring lock "refresh_cache-b779ccfb-9278-4a00-aadf-bd4afb0ab54a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 831.144361] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Acquired lock "refresh_cache-b779ccfb-9278-4a00-aadf-bd4afb0ab54a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 831.144449] env[61439]: DEBUG nova.network.neutron [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 831.210878] env[61439]: DEBUG nova.network.neutron [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 831.678864] env[61439]: DEBUG nova.network.neutron [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 831.696108] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Releasing lock "refresh_cache-b779ccfb-9278-4a00-aadf-bd4afb0ab54a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 831.696371] env[61439]: DEBUG nova.compute.manager [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 831.696571] env[61439]: DEBUG nova.compute.manager [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 831.696742] env[61439]: DEBUG nova.network.neutron [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 831.800921] env[61439]: DEBUG nova.network.neutron [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 831.820682] env[61439]: DEBUG nova.network.neutron [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 831.840459] env[61439]: INFO nova.compute.manager [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: b779ccfb-9278-4a00-aadf-bd4afb0ab54a] Took 0.14 seconds to deallocate network for instance. 
[ 831.877835] env[61439]: ERROR nova.compute.manager [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 7284a82c-18a5-43a0-b4d9-67251e9955b7, please check neutron logs for more information.
[ 831.877835] env[61439]: ERROR nova.compute.manager Traceback (most recent call last):
[ 831.877835] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 831.877835] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 831.877835] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 831.877835] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 831.877835] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 831.877835] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 831.877835] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 831.877835] env[61439]: ERROR nova.compute.manager self.force_reraise()
[ 831.877835] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 831.877835] env[61439]: ERROR nova.compute.manager raise self.value
[ 831.877835] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 831.877835] env[61439]: ERROR nova.compute.manager updated_port = self._update_port(
[ 831.877835] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 831.877835] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 831.878339] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 831.878339] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 831.878339] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 7284a82c-18a5-43a0-b4d9-67251e9955b7, please check neutron logs for more information.
[ 831.878339] env[61439]: ERROR nova.compute.manager
[ 831.878339] env[61439]: Traceback (most recent call last):
[ 831.878339] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait
[ 831.878339] env[61439]: listener.cb(fileno)
[ 831.878339] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 831.878339] env[61439]: result = function(*args, **kwargs)
[ 831.878339] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 831.878339] env[61439]: return func(*args, **kwargs)
[ 831.878339] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 831.878339] env[61439]: raise e
[ 831.878339] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 831.878339] env[61439]: nwinfo = self.network_api.allocate_for_instance(
[ 831.878339] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 831.878339] env[61439]: created_port_ids = self._update_ports_for_instance(
[ 831.878339] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 831.878339] env[61439]: with excutils.save_and_reraise_exception():
[ 831.878339] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 831.878339] env[61439]: self.force_reraise()
[ 831.878339] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 831.878339] env[61439]: raise self.value
[ 831.878339] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 831.878339] env[61439]: updated_port = self._update_port(
[ 831.878339] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 831.878339] env[61439]: _ensure_no_port_binding_failure(port)
[ 831.878339] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 831.878339] env[61439]: raise exception.PortBindingFailed(port_id=port['id'])
[ 831.879188] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 7284a82c-18a5-43a0-b4d9-67251e9955b7, please check neutron logs for more information.
[ 831.879188] env[61439]: Removing descriptor: 21
[ 831.879188] env[61439]: ERROR nova.compute.manager [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 7284a82c-18a5-43a0-b4d9-67251e9955b7, please check neutron logs for more information.
[ 831.879188] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] Traceback (most recent call last):
[ 831.879188] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 831.879188] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] yield resources
[ 831.879188] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 831.879188] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] self.driver.spawn(context, instance, image_meta,
[ 831.879188] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 831.879188] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 831.879188] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 831.879188] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] vm_ref = self.build_virtual_machine(instance,
[ 831.879568] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 831.879568] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] vif_infos = vmwarevif.get_vif_info(self._session,
[ 831.879568] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 831.879568] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] for vif in network_info:
[ 831.879568] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 831.879568] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] return self._sync_wrapper(fn, *args, **kwargs)
[ 831.879568] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 831.879568] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] self.wait()
[ 831.879568] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 831.879568] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] self[:] = self._gt.wait()
[ 831.879568] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 831.879568] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] return self._exit_event.wait()
[ 831.879568] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 831.879968] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] result = hub.switch()
[ 831.879968] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 831.879968] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] return self.greenlet.switch()
[ 831.879968] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 831.879968] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] result = function(*args, **kwargs)
[ 831.879968] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 831.879968] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] return func(*args, **kwargs)
[ 831.879968] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 831.879968] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] raise e
[ 831.879968] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 831.879968] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] nwinfo = self.network_api.allocate_for_instance(
[ 831.879968] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 831.879968] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] created_port_ids = self._update_ports_for_instance(
[ 831.880410] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 831.880410] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] with excutils.save_and_reraise_exception():
[ 831.880410] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 831.880410] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] self.force_reraise()
[ 831.880410] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 831.880410] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] raise self.value
[ 831.880410] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 831.880410] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] updated_port = self._update_port(
[ 831.880410] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 831.880410] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] _ensure_no_port_binding_failure(port)
[ 831.880410] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 831.880410] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] raise exception.PortBindingFailed(port_id=port['id'])
[ 831.880742] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] nova.exception.PortBindingFailed: Binding failed for port 7284a82c-18a5-43a0-b4d9-67251e9955b7, please check neutron logs for more information.
[ 831.880742] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24]
[ 831.880742] env[61439]: INFO nova.compute.manager [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Terminating instance
[ 831.882913] env[61439]: DEBUG oslo_concurrency.lockutils [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Acquiring lock "refresh_cache-50266004-15d2-46cf-9f48-315f24831d24" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 831.883153] env[61439]: DEBUG oslo_concurrency.lockutils [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Acquired lock "refresh_cache-50266004-15d2-46cf-9f48-315f24831d24" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 831.883266] env[61439]: DEBUG nova.network.neutron [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 831.989781] env[61439]: INFO nova.scheduler.client.report [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Deleted allocations for instance b779ccfb-9278-4a00-aadf-bd4afb0ab54a
[ 832.002174] env[61439]: DEBUG nova.network.neutron [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 832.023412] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2052ebab-d6dc-4f80-8812-ebfc42cea74e tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Lock "b779ccfb-9278-4a00-aadf-bd4afb0ab54a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 19.618s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 832.303518] env[61439]: WARNING oslo_vmware.rw_handles [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 832.303518] env[61439]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 832.303518] env[61439]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 832.303518] env[61439]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 832.303518] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 832.303518] env[61439]: ERROR oslo_vmware.rw_handles response.begin()
[ 832.303518] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 832.303518] env[61439]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 832.303518] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 832.303518] env[61439]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 832.303518] env[61439]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 832.303518] env[61439]: ERROR oslo_vmware.rw_handles
[ 832.303518] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Downloaded image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to vmware_temp/9fbdec98-5e34-4b3a-a04d-228e7511f58a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 832.305563] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Caching image {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 832.305856] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Copying Virtual Disk [datastore2] vmware_temp/9fbdec98-5e34-4b3a-a04d-228e7511f58a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk to [datastore2] vmware_temp/9fbdec98-5e34-4b3a-a04d-228e7511f58a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk {{(pid=61439) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 832.306168] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a4c3267a-9ac0-4bec-be6b-cb7298c8381a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 832.321073] env[61439]: DEBUG oslo_vmware.api [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Waiting for the task: (returnval){
[ 832.321073] env[61439]: value = "task-987686"
[ 832.321073] env[61439]: _type = "Task"
[ 832.321073] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 832.329118] env[61439]: DEBUG oslo_vmware.api [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Task: {'id': task-987686, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 832.359952] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Acquiring lock "a7144e03-407c-436c-9e22-1eeaeb43a210" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 832.361694] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Lock "a7144e03-407c-436c-9e22-1eeaeb43a210" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 832.390831] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Acquiring lock "2d8f035e-8a57-4937-a394-10d94a5630ad" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 832.391430] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Lock "2d8f035e-8a57-4937-a394-10d94a5630ad" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 832.392572] env[61439]: DEBUG nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 832.411247] env[61439]: DEBUG nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 832.475409] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 832.475409] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 832.475409] env[61439]: INFO nova.compute.claims [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 832.478654] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 832.626648] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 832.626887] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 832.627064] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 832.628048] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=61439) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 832.662674] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d3eba53-095a-4673-9866-fda9d11f3d45 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 832.671334] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b31501cb-771a-41f5-8d57-65f8995f14b9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 832.703031] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f791b9f1-2702-4909-998a-34d37c7d46ba {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 832.711223] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea865b9d-fac0-4c66-aead-efbe7229a2e9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 832.725250] env[61439]: DEBUG nova.compute.provider_tree [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70
tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 832.740077] env[61439]: DEBUG nova.scheduler.client.report [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 832.753800] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.281s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 832.754441] env[61439]: DEBUG nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Start building networks asynchronously for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 832.757870] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.278s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 832.758770] env[61439]: INFO nova.compute.claims [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 832.816176] env[61439]: DEBUG nova.compute.utils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 832.819512] env[61439]: DEBUG nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 832.819695] env[61439]: DEBUG nova.network.neutron [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 832.840101] env[61439]: DEBUG nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 832.853664] env[61439]: DEBUG oslo_vmware.exceptions [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Fault InvalidArgument not matched. 
{{(pid=61439) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 832.854123] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 832.855546] env[61439]: ERROR nova.compute.manager [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 832.855546] env[61439]: Faults: ['InvalidArgument'] [ 832.855546] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] Traceback (most recent call last): [ 832.855546] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 832.855546] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] yield resources [ 832.855546] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 832.855546] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] self.driver.spawn(context, instance, image_meta, [ 832.855546] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 832.855546] env[61439]: ERROR nova.compute.manager [instance: 
4527b287-d099-443c-a424-185d02054be0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 832.855546] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 832.855546] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] self._fetch_image_if_missing(context, vi) [ 832.855546] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 832.856493] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] image_cache(vi, tmp_image_ds_loc) [ 832.856493] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 832.856493] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] vm_util.copy_virtual_disk( [ 832.856493] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 832.856493] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] session._wait_for_task(vmdk_copy_task) [ 832.856493] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 832.856493] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] return self.wait_for_task(task_ref) [ 832.856493] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 832.856493] env[61439]: ERROR nova.compute.manager [instance: 
4527b287-d099-443c-a424-185d02054be0] return evt.wait() [ 832.856493] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 832.856493] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] result = hub.switch() [ 832.856493] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 832.856493] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] return self.greenlet.switch() [ 832.857190] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 832.857190] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] self.f(*self.args, **self.kw) [ 832.857190] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 832.857190] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] raise exceptions.translate_fault(task_info.error) [ 832.857190] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 832.857190] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] Faults: ['InvalidArgument'] [ 832.857190] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] [ 832.857190] env[61439]: INFO nova.compute.manager [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 
tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Terminating instance [ 832.857570] env[61439]: DEBUG oslo_concurrency.lockutils [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 832.857658] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 832.858185] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d49bbc81-0b78-45a6-8e93-580dbd7e51ee {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 832.860469] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Acquiring lock "refresh_cache-4527b287-d099-443c-a424-185d02054be0" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 832.860629] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Acquired lock "refresh_cache-4527b287-d099-443c-a424-185d02054be0" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 832.860798] env[61439]: DEBUG nova.network.neutron [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f 
tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 832.881318] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 832.881318] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=61439) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 832.881318] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-048267bb-44c8-48dc-bcf3-69adf0236c3a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 832.890075] env[61439]: DEBUG oslo_vmware.api [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Waiting for the task: (returnval){ [ 832.890075] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52c69662-480d-6517-b101-015e8219b2ea" [ 832.890075] env[61439]: _type = "Task" [ 832.890075] env[61439]: } to complete. 
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 832.904319] env[61439]: DEBUG oslo_vmware.api [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52c69662-480d-6517-b101-015e8219b2ea, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 832.964216] env[61439]: DEBUG nova.policy [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '919a60d5a73a4077b510084cc28a9499', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '278860d9a5104cebabf418408d8558d2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 832.971014] env[61439]: DEBUG nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 833.003165] env[61439]: DEBUG nova.compute.manager [req-5605a970-d583-4f1b-b768-3aa52dfc9a0a req-45138769-c5f5-4127-8766-f039105dd087 service nova] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Received event network-changed-7284a82c-18a5-43a0-b4d9-67251e9955b7 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 833.003165] env[61439]: DEBUG nova.compute.manager [req-5605a970-d583-4f1b-b768-3aa52dfc9a0a req-45138769-c5f5-4127-8766-f039105dd087 service nova] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Refreshing instance network info cache due to event network-changed-7284a82c-18a5-43a0-b4d9-67251e9955b7. {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 833.003165] env[61439]: DEBUG oslo_concurrency.lockutils [req-5605a970-d583-4f1b-b768-3aa52dfc9a0a req-45138769-c5f5-4127-8766-f039105dd087 service nova] Acquiring lock "refresh_cache-50266004-15d2-46cf-9f48-315f24831d24" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 833.005138] env[61439]: DEBUG nova.virt.hardware [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 833.005367] env[61439]: DEBUG nova.virt.hardware [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 833.005525] env[61439]: DEBUG nova.virt.hardware [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 833.005708] env[61439]: DEBUG nova.virt.hardware [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 833.005855] env[61439]: DEBUG nova.virt.hardware [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 833.006012] env[61439]: DEBUG nova.virt.hardware [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, 
cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 833.006241] env[61439]: DEBUG nova.virt.hardware [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 833.006379] env[61439]: DEBUG nova.virt.hardware [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 833.006554] env[61439]: DEBUG nova.virt.hardware [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 833.006717] env[61439]: DEBUG nova.virt.hardware [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 833.006890] env[61439]: DEBUG nova.virt.hardware [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 833.008367] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-71052eef-788c-4ce0-9ca9-26aca52b4983 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 833.020701] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ddbea19-76b9-40f7-82b2-de5c8abcedcd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 833.049035] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48b6e096-8151-4120-b678-f96c3326e110 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 833.056298] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79eb6fcf-a76c-4b99-8f73-4302e7fc6c1c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 833.095653] env[61439]: DEBUG nova.network.neutron [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 833.098798] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7179eb94-aa33-4452-9716-da462aed09d2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 833.108085] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac1ff8b6-2268-400a-ad20-a0b5469736f1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 833.127366] env[61439]: DEBUG nova.compute.provider_tree [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 833.139062] env[61439]: DEBUG nova.scheduler.client.report [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 833.158940] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Lock "compute_resources" 
"released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.402s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 833.159334] env[61439]: DEBUG nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 833.197655] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 833.212812] env[61439]: DEBUG nova.compute.utils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 833.212812] env[61439]: DEBUG nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 833.212812] env[61439]: DEBUG nova.network.neutron [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 833.228821] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 833.233308] env[61439]: DEBUG nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Start building block device mappings for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 833.280126] env[61439]: DEBUG nova.network.neutron [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 833.292240] env[61439]: DEBUG oslo_concurrency.lockutils [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Releasing lock "refresh_cache-50266004-15d2-46cf-9f48-315f24831d24" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 833.292871] env[61439]: DEBUG nova.compute.manager [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 833.293718] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 833.294371] env[61439]: DEBUG oslo_concurrency.lockutils [req-5605a970-d583-4f1b-b768-3aa52dfc9a0a req-45138769-c5f5-4127-8766-f039105dd087 service nova] Acquired lock "refresh_cache-50266004-15d2-46cf-9f48-315f24831d24" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 833.294525] env[61439]: DEBUG nova.network.neutron [req-5605a970-d583-4f1b-b768-3aa52dfc9a0a req-45138769-c5f5-4127-8766-f039105dd087 service nova] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Refreshing network info cache for port 7284a82c-18a5-43a0-b4d9-67251e9955b7 {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 833.295844] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a6d71722-4a75-45fc-b3f1-5b06ecf6d24a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 833.306646] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c04bf4a-2aa3-4bfa-8cae-82e61a172b5a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 833.330725] env[61439]: DEBUG oslo_concurrency.lockutils [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Acquiring lock "53400571-766c-4020-b163-87a8816199cd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" 
{{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 833.330974] env[61439]: DEBUG oslo_concurrency.lockutils [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Lock "53400571-766c-4020-b163-87a8816199cd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 833.338524] env[61439]: DEBUG nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 833.348176] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 50266004-15d2-46cf-9f48-315f24831d24 could not be found. 
[ 833.348176] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 833.348480] env[61439]: INFO nova.compute.manager [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Took 0.06 seconds to destroy the instance on the hypervisor.
[ 833.348755] env[61439]: DEBUG oslo.service.loopingcall [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 833.349197] env[61439]: DEBUG nova.compute.manager [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 833.353370] env[61439]: DEBUG nova.compute.manager [-] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 833.353370] env[61439]: DEBUG nova.network.neutron [-] [instance: 50266004-15d2-46cf-9f48-315f24831d24] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 833.372238] env[61439]: DEBUG nova.virt.hardware [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=<?>,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-09-20T17:02:48Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 833.372542] env[61439]: DEBUG nova.virt.hardware [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 833.372695] env[61439]: DEBUG nova.virt.hardware [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 833.372870] env[61439]: DEBUG nova.virt.hardware [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 833.373021] env[61439]: DEBUG nova.virt.hardware [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 833.373167] env[61439]: DEBUG nova.virt.hardware [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 833.373404] env[61439]: DEBUG nova.virt.hardware [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 833.373553] env[61439]: DEBUG nova.virt.hardware [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 833.374266] env[61439]: DEBUG nova.virt.hardware [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 833.374266] env[61439]: DEBUG nova.virt.hardware [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 833.374266] env[61439]: DEBUG nova.virt.hardware [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 833.375014] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-413c150d-2e3c-41ae-93eb-54776e808127 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 833.384421] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9871303-9d74-40d7-870a-2d8d1df6111f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 833.412484] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Preparing fetch location {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 833.412863] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Creating directory with path [datastore2] vmware_temp/aaf80575-1139-4dee-8576-34bb69d67fce/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 833.413417] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-54f26f04-ad01-44a7-acdd-d5aa1f07c0de {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 833.419962] env[61439]: DEBUG nova.network.neutron [-] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 833.428230] env[61439]: DEBUG oslo_concurrency.lockutils [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 833.428379] env[61439]: DEBUG oslo_concurrency.lockutils [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 833.432999] env[61439]: INFO nova.compute.claims [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 833.437344] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Created directory with path [datastore2] vmware_temp/aaf80575-1139-4dee-8576-34bb69d67fce/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 833.437541] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Fetch image to [datastore2] vmware_temp/aaf80575-1139-4dee-8576-34bb69d67fce/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 833.437708] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to [datastore2] vmware_temp/aaf80575-1139-4dee-8576-34bb69d67fce/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 833.438254] env[61439]: DEBUG nova.network.neutron [-] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 833.439985] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e0a7516-1f2d-4c32-b7a3-a115fbfe58dd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 833.449787] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ed546c8-8a8e-4a3b-950e-cb23654f1db2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 833.458245] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e06b1d6-0b65-4b3e-9eb7-91a31fead150 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 833.463373] env[61439]: INFO nova.compute.manager [-] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Took 0.11 seconds to deallocate network for instance.
[ 833.466067] env[61439]: DEBUG nova.compute.claims [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 833.466189] env[61439]: DEBUG oslo_concurrency.lockutils [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 833.497242] env[61439]: DEBUG nova.network.neutron [req-5605a970-d583-4f1b-b768-3aa52dfc9a0a req-45138769-c5f5-4127-8766-f039105dd087 service nova] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 833.505666] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b99d423a-a996-411d-b0a3-059e4b38e6ae {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 833.514490] env[61439]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-cf9db32a-df09-467a-8b49-326b79564df2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 833.540042] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 833.625597] env[61439]: DEBUG oslo_vmware.rw_handles [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/aaf80575-1139-4dee-8576-34bb69d67fce/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 833.695341] env[61439]: DEBUG oslo_vmware.rw_handles [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Completed reading data from the image iterator. {{(pid=61439) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 833.695341] env[61439]: DEBUG oslo_vmware.rw_handles [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/aaf80575-1139-4dee-8576-34bb69d67fce/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 833.714341] env[61439]: DEBUG nova.policy [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '919a60d5a73a4077b510084cc28a9499', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '278860d9a5104cebabf418408d8558d2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}}
[ 833.723165] env[61439]: DEBUG nova.network.neutron [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 833.742736] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Releasing lock "refresh_cache-4527b287-d099-443c-a424-185d02054be0" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 833.743379] env[61439]: DEBUG nova.compute.manager [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 833.743379] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 833.744529] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd535c10-4789-4f4f-93cd-027716ed37cb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 833.753994] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Unregistering the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 833.754602] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-035bd032-20d0-4ecd-829d-f27f67523b3c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 833.777932] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86f042a7-64ae-493f-9902-d1f6eb66e34b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 833.784366] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Unregistered the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 833.784654] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Deleting contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 833.784809] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Deleting the datastore file [datastore2] 4527b287-d099-443c-a424-185d02054be0 {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 833.785120] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9e6f93a6-bdd2-4274-8613-9b71c46af9f8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 833.790801] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52985ad2-14d3-4e58-911f-793f058c7213 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 833.795363] env[61439]: DEBUG oslo_vmware.api [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Waiting for the task: (returnval){
[ 833.795363] env[61439]: value = "task-987688"
[ 833.795363] env[61439]: _type = "Task"
[ 833.795363] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 833.826683] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ae7d334-7b89-472e-b3b8-52c0493d6646 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 833.834460] env[61439]: DEBUG oslo_vmware.api [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Task: {'id': task-987688, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.034882} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 833.834460] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Deleted the datastore file {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 833.834460] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Deleted contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 833.834460] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 833.834460] env[61439]: INFO nova.compute.manager [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Took 0.09 seconds to destroy the instance on the hypervisor.
[ 833.834645] env[61439]: DEBUG oslo.service.loopingcall [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 833.834645] env[61439]: DEBUG nova.compute.manager [-] [instance: 4527b287-d099-443c-a424-185d02054be0] Skipping network deallocation for instance since networking was not requested. {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 833.839360] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10e7282d-13c8-4f7d-8d4e-45bec1da30b5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 833.845339] env[61439]: DEBUG nova.compute.claims [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 833.845339] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 833.855238] env[61439]: DEBUG nova.compute.provider_tree [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 833.863944] env[61439]: DEBUG nova.scheduler.client.report [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 833.880852] env[61439]: DEBUG oslo_concurrency.lockutils [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.452s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 833.881384] env[61439]: DEBUG nova.compute.manager [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 833.883824] env[61439]: DEBUG oslo_concurrency.lockutils [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.418s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 833.916559] env[61439]: DEBUG nova.compute.utils [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 833.917811] env[61439]: DEBUG nova.compute.manager [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 833.917986] env[61439]: DEBUG nova.network.neutron [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 833.925540] env[61439]: DEBUG nova.compute.manager [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 834.002019] env[61439]: DEBUG nova.compute.manager [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 834.037203] env[61439]: DEBUG nova.virt.hardware [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=<?>,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-09-20T17:02:48Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 834.037455] env[61439]: DEBUG nova.virt.hardware [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 834.037611] env[61439]: DEBUG nova.virt.hardware [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 834.037793] env[61439]: DEBUG nova.virt.hardware [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 834.037942] env[61439]: DEBUG nova.virt.hardware [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 834.038342] env[61439]: DEBUG nova.virt.hardware [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 834.038342] env[61439]: DEBUG nova.virt.hardware [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 834.038508] env[61439]: DEBUG nova.virt.hardware [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 834.038635] env[61439]: DEBUG nova.virt.hardware [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 834.038793] env[61439]: DEBUG nova.virt.hardware [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 834.038965] env[61439]: DEBUG nova.virt.hardware [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 834.040244] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b20b8cb-dbfa-4c47-8d15-0acd4f59a6a2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 834.048226] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3aea2442-270e-424c-a074-df04c872fbec {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 834.088544] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-778bc89d-cac1-4aee-8347-8645b1903ed1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 834.096309] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5714c3fe-4df9-4a8e-863c-809adba7f378 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 834.127738] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65e2e9cd-8a54-402c-98a2-973a1b9a48c2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 834.135753] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8514ddf9-a508-4f92-80ca-50e9e6c0dc86 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 834.150995] env[61439]: DEBUG nova.compute.provider_tree [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 834.167731] env[61439]: DEBUG nova.scheduler.client.report [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 834.184736] env[61439]: DEBUG oslo_concurrency.lockutils [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.301s {{(pid=61439) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 834.185385] env[61439]: ERROR nova.compute.manager [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 7284a82c-18a5-43a0-b4d9-67251e9955b7, please check neutron logs for more information. [ 834.185385] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] Traceback (most recent call last): [ 834.185385] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 834.185385] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] self.driver.spawn(context, instance, image_meta, [ 834.185385] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 834.185385] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] self._vmops.spawn(context, instance, image_meta, injected_files, [ 834.185385] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 834.185385] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] vm_ref = self.build_virtual_machine(instance, [ 834.185385] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 834.185385] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] vif_infos = vmwarevif.get_vif_info(self._session, 
[ 834.185385] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 834.185782] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] for vif in network_info: [ 834.185782] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 834.185782] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] return self._sync_wrapper(fn, *args, **kwargs) [ 834.185782] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 834.185782] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] self.wait() [ 834.185782] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 834.185782] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] self[:] = self._gt.wait() [ 834.185782] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 834.185782] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] return self._exit_event.wait() [ 834.185782] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 834.185782] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] result = hub.switch() [ 834.185782] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 834.185782] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] return self.greenlet.switch() [ 834.186183] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 834.186183] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] result = function(*args, **kwargs) [ 834.186183] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 834.186183] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] return func(*args, **kwargs) [ 834.186183] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 834.186183] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] raise e [ 834.186183] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 834.186183] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] nwinfo = self.network_api.allocate_for_instance( [ 834.186183] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 834.186183] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] created_port_ids = self._update_ports_for_instance( [ 834.186183] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/network/neutron.py", line 
1365, in _update_ports_for_instance [ 834.186183] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] with excutils.save_and_reraise_exception(): [ 834.186183] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 834.186595] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] self.force_reraise() [ 834.186595] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 834.186595] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] raise self.value [ 834.186595] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 834.186595] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] updated_port = self._update_port( [ 834.186595] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 834.186595] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] _ensure_no_port_binding_failure(port) [ 834.186595] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 834.186595] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] raise exception.PortBindingFailed(port_id=port['id']) [ 834.186595] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] nova.exception.PortBindingFailed: Binding failed for port 
7284a82c-18a5-43a0-b4d9-67251e9955b7, please check neutron logs for more information. [ 834.186595] env[61439]: ERROR nova.compute.manager [instance: 50266004-15d2-46cf-9f48-315f24831d24] [ 834.186945] env[61439]: DEBUG nova.compute.utils [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Binding failed for port 7284a82c-18a5-43a0-b4d9-67251e9955b7, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 834.187203] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.343s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 834.189866] env[61439]: DEBUG nova.compute.manager [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Build of instance 50266004-15d2-46cf-9f48-315f24831d24 was re-scheduled: Binding failed for port 7284a82c-18a5-43a0-b4d9-67251e9955b7, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 834.190339] env[61439]: DEBUG nova.compute.manager [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 834.190560] env[61439]: DEBUG oslo_concurrency.lockutils [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Acquiring lock "refresh_cache-50266004-15d2-46cf-9f48-315f24831d24" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 834.406077] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0a76583-1be1-403f-9880-24768d7d7963 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 834.414622] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b440ad0-cea2-4ea7-a2cc-aee530a7fb1f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 834.453644] env[61439]: DEBUG nova.policy [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66924fb566fb455bada0c920137fb884', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3bab637ad53d433db6fb2017b6c0c2aa', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': 
None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 834.455808] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9026dff5-1dde-410c-a683-aa5d871b1cac {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 834.464906] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-142c7976-6469-4d99-8155-79a6de64b529 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 834.482942] env[61439]: DEBUG nova.compute.provider_tree [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 834.497032] env[61439]: DEBUG nova.scheduler.client.report [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 834.510533] env[61439]: DEBUG nova.network.neutron [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 
a7144e03-407c-436c-9e22-1eeaeb43a210] Successfully created port: 408f5613-8187-43fd-9e67-c27b5776065e {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 834.521415] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.334s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 834.522398] env[61439]: ERROR nova.compute.manager [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 834.522398] env[61439]: Faults: ['InvalidArgument'] [ 834.522398] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] Traceback (most recent call last): [ 834.522398] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 834.522398] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] self.driver.spawn(context, instance, image_meta, [ 834.522398] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 834.522398] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 834.522398] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", 
line 786, in spawn [ 834.522398] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] self._fetch_image_if_missing(context, vi) [ 834.522398] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 834.522398] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] image_cache(vi, tmp_image_ds_loc) [ 834.522398] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 834.522821] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] vm_util.copy_virtual_disk( [ 834.522821] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 834.522821] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] session._wait_for_task(vmdk_copy_task) [ 834.522821] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 834.522821] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] return self.wait_for_task(task_ref) [ 834.522821] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 834.522821] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] return evt.wait() [ 834.522821] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 834.522821] 
env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] result = hub.switch() [ 834.522821] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 834.522821] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] return self.greenlet.switch() [ 834.522821] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 834.522821] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] self.f(*self.args, **self.kw) [ 834.523311] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 834.523311] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] raise exceptions.translate_fault(task_info.error) [ 834.523311] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 834.523311] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] Faults: ['InvalidArgument'] [ 834.523311] env[61439]: ERROR nova.compute.manager [instance: 4527b287-d099-443c-a424-185d02054be0] [ 834.523311] env[61439]: DEBUG nova.compute.utils [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] VimFaultException {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 834.526043] env[61439]: DEBUG nova.compute.manager [None 
req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Build of instance 4527b287-d099-443c-a424-185d02054be0 was re-scheduled: A specified parameter was not correct: fileType [ 834.526043] env[61439]: Faults: ['InvalidArgument'] {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 834.526043] env[61439]: DEBUG nova.compute.manager [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 834.526043] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Acquiring lock "refresh_cache-4527b287-d099-443c-a424-185d02054be0" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 834.526043] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Acquired lock "refresh_cache-4527b287-d099-443c-a424-185d02054be0" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 834.526282] env[61439]: DEBUG nova.network.neutron [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 834.575283] env[61439]: DEBUG nova.network.neutron [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f 
tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 834.729877] env[61439]: DEBUG nova.network.neutron [req-5605a970-d583-4f1b-b768-3aa52dfc9a0a req-45138769-c5f5-4127-8766-f039105dd087 service nova] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 834.744135] env[61439]: DEBUG oslo_concurrency.lockutils [req-5605a970-d583-4f1b-b768-3aa52dfc9a0a req-45138769-c5f5-4127-8766-f039105dd087 service nova] Releasing lock "refresh_cache-50266004-15d2-46cf-9f48-315f24831d24" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 834.744135] env[61439]: DEBUG oslo_concurrency.lockutils [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Acquired lock "refresh_cache-50266004-15d2-46cf-9f48-315f24831d24" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 834.744135] env[61439]: DEBUG nova.network.neutron [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 834.805362] env[61439]: DEBUG nova.network.neutron [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Updating instance_info_cache with network_info: [] {{(pid=61439) 
update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 834.816888] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Releasing lock "refresh_cache-4527b287-d099-443c-a424-185d02054be0" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 834.817045] env[61439]: DEBUG nova.compute.manager [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 834.817471] env[61439]: DEBUG nova.compute.manager [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] [instance: 4527b287-d099-443c-a424-185d02054be0] Skipping network deallocation for instance since networking was not requested. 
{{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 834.941561] env[61439]: INFO nova.scheduler.client.report [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Deleted allocations for instance 4527b287-d099-443c-a424-185d02054be0 [ 834.968646] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cef5af9-8695-4847-a9ac-663a1cce5d4f tempest-ServerShowV247Test-985300212 tempest-ServerShowV247Test-985300212-project-member] Lock "4527b287-d099-443c-a424-185d02054be0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 132.205s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 835.019428] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Acquiring lock "a0be018c-59be-4eb0-a3aa-b7814ce240e8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 835.019674] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Lock "a0be018c-59be-4eb0-a3aa-b7814ce240e8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 835.029174] env[61439]: DEBUG nova.compute.manager [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] 
Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 835.088026] env[61439]: DEBUG nova.network.neutron [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 835.152038] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 835.152468] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 835.154786] env[61439]: INFO nova.compute.claims [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 835.201333] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 835.296405] env[61439]: ERROR nova.compute.manager [None 
req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 9c39d9bd-de1a-462d-9218-e51ebfb16c02, please check neutron logs for more information. [ 835.296405] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 835.296405] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 835.296405] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 835.296405] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 835.296405] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 835.296405] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 835.296405] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 835.296405] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 835.296405] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 835.296405] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 835.296405] env[61439]: ERROR nova.compute.manager raise self.value [ 835.296405] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 835.296405] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 835.296405] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 835.296405] env[61439]: ERROR 
nova.compute.manager _ensure_no_port_binding_failure(port) [ 835.296937] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 835.296937] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 835.296937] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 9c39d9bd-de1a-462d-9218-e51ebfb16c02, please check neutron logs for more information. [ 835.296937] env[61439]: ERROR nova.compute.manager [ 835.296937] env[61439]: Traceback (most recent call last): [ 835.296937] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 835.296937] env[61439]: listener.cb(fileno) [ 835.296937] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 835.296937] env[61439]: result = function(*args, **kwargs) [ 835.296937] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 835.296937] env[61439]: return func(*args, **kwargs) [ 835.296937] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 835.296937] env[61439]: raise e [ 835.296937] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 835.296937] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 835.296937] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 835.296937] env[61439]: created_port_ids = self._update_ports_for_instance( [ 835.296937] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 835.296937] env[61439]: with excutils.save_and_reraise_exception(): [ 835.296937] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 835.296937] env[61439]: 
self.force_reraise() [ 835.296937] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 835.296937] env[61439]: raise self.value [ 835.296937] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 835.296937] env[61439]: updated_port = self._update_port( [ 835.296937] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 835.296937] env[61439]: _ensure_no_port_binding_failure(port) [ 835.296937] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 835.296937] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 835.297770] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 9c39d9bd-de1a-462d-9218-e51ebfb16c02, please check neutron logs for more information. [ 835.297770] env[61439]: Removing descriptor: 24 [ 835.297770] env[61439]: ERROR nova.compute.manager [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 9c39d9bd-de1a-462d-9218-e51ebfb16c02, please check neutron logs for more information. 
[ 835.297770] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Traceback (most recent call last): [ 835.297770] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 835.297770] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] yield resources [ 835.297770] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 835.297770] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] self.driver.spawn(context, instance, image_meta, [ 835.297770] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 835.297770] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] self._vmops.spawn(context, instance, image_meta, injected_files, [ 835.297770] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 835.297770] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] vm_ref = self.build_virtual_machine(instance, [ 835.298136] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 835.298136] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] vif_infos = vmwarevif.get_vif_info(self._session, [ 835.298136] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 835.298136] env[61439]: ERROR 
nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] for vif in network_info: [ 835.298136] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 835.298136] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] return self._sync_wrapper(fn, *args, **kwargs) [ 835.298136] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 835.298136] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] self.wait() [ 835.298136] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 835.298136] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] self[:] = self._gt.wait() [ 835.298136] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 835.298136] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] return self._exit_event.wait() [ 835.298136] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 835.298539] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] result = hub.switch() [ 835.298539] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 835.298539] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] return self.greenlet.switch() [ 835.298539] 
env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 835.298539] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] result = function(*args, **kwargs) [ 835.298539] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 835.298539] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] return func(*args, **kwargs) [ 835.298539] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 835.298539] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] raise e [ 835.298539] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 835.298539] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] nwinfo = self.network_api.allocate_for_instance( [ 835.298539] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 835.298539] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] created_port_ids = self._update_ports_for_instance( [ 835.298908] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 835.298908] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] with excutils.save_and_reraise_exception(): [ 835.298908] env[61439]: ERROR nova.compute.manager 
[instance: da0610f3-ab6b-4496-ba18-2794869a2831] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 835.298908] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] self.force_reraise() [ 835.298908] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 835.298908] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] raise self.value [ 835.298908] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 835.298908] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] updated_port = self._update_port( [ 835.298908] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 835.298908] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] _ensure_no_port_binding_failure(port) [ 835.298908] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 835.298908] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] raise exception.PortBindingFailed(port_id=port['id']) [ 835.299247] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] nova.exception.PortBindingFailed: Binding failed for port 9c39d9bd-de1a-462d-9218-e51ebfb16c02, please check neutron logs for more information. 
[ 835.299247] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] [ 835.299247] env[61439]: INFO nova.compute.manager [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Terminating instance [ 835.300134] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "refresh_cache-da0610f3-ab6b-4496-ba18-2794869a2831" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 835.300312] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquired lock "refresh_cache-da0610f3-ab6b-4496-ba18-2794869a2831" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 835.301036] env[61439]: DEBUG nova.network.neutron [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 835.389972] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b354bcf-64ff-42ba-bd9b-1041e2688782 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 835.398468] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ee02010-c1da-45ec-b3eb-ac7a58fb3d76 {{(pid=61439) request_handler 
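The traceback above passes through `excutils.save_and_reraise_exception()` twice (frames at neutron.py:1365 and excutils.py:227/200): nova runs its port-rollback cleanup inside that context manager and then re-raises the original `PortBindingFailed`. The following is a minimal stdlib-only sketch of that pattern — the simplified context manager, the `PortBindingFailed` class, and `update_ports` here are illustrative stand-ins, not the real oslo_utils or nova implementations:

```python
import sys
from contextlib import contextmanager


@contextmanager
def save_and_reraise_exception():
    """Simplified sketch of oslo_utils.excutils.save_and_reraise_exception:
    capture the in-flight exception, let the body do cleanup, then re-raise
    the original exception so callers still see the real failure."""
    exc = sys.exc_info()[1]  # the exception active in the enclosing except block
    try:
        yield
    finally:
        if exc is not None:
            raise exc


class PortBindingFailed(Exception):
    """Stand-in for nova.exception.PortBindingFailed."""


cleanup_ran = []


def update_ports():
    # Mirrors the shape of _update_ports_for_instance: the port update fails,
    # cleanup runs inside the context manager, and the original error propagates.
    try:
        raise PortBindingFailed("Binding failed for port 9c39d9bd-...")
    except PortBindingFailed:
        with save_and_reraise_exception():
            cleanup_ran.append(True)  # e.g. roll back any ports already created
```

This is why the log shows the same `PortBindingFailed` message repeatedly: the exception is preserved across the cleanup and re-raised at each layer (`_update_port`, `_update_ports_for_instance`, `_allocate_network_async`) until `_build_and_run_instance` catches it and terminates the instance.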
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 835.432106] env[61439]: DEBUG nova.network.neutron [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 835.435133] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dddad6cc-e7fe-4e46-9b08-85bb7b5a74c7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 835.443324] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cc375a5-5289-40c1-aa7f-4dc8f3463283 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 835.461476] env[61439]: DEBUG nova.compute.provider_tree [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 835.476140] env[61439]: DEBUG nova.scheduler.client.report [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 
'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 835.499202] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.347s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 835.499707] env[61439]: DEBUG nova.compute.manager [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 835.547027] env[61439]: DEBUG nova.compute.utils [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 835.548347] env[61439]: DEBUG nova.compute.manager [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 835.548588] env[61439]: DEBUG nova.network.neutron [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 835.564515] env[61439]: DEBUG nova.compute.manager [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 835.617768] env[61439]: DEBUG nova.compute.manager [req-c3f0e7e4-6f5b-48d7-8206-0cf2d06e6b77 req-37986425-e3ed-4d05-aebc-88e723e0e0d4 service nova] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Received event network-changed-9c39d9bd-de1a-462d-9218-e51ebfb16c02 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 835.617970] env[61439]: DEBUG nova.compute.manager [req-c3f0e7e4-6f5b-48d7-8206-0cf2d06e6b77 req-37986425-e3ed-4d05-aebc-88e723e0e0d4 service nova] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Refreshing instance network info cache due to event network-changed-9c39d9bd-de1a-462d-9218-e51ebfb16c02. 
{{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 835.618181] env[61439]: DEBUG oslo_concurrency.lockutils [req-c3f0e7e4-6f5b-48d7-8206-0cf2d06e6b77 req-37986425-e3ed-4d05-aebc-88e723e0e0d4 service nova] Acquiring lock "refresh_cache-da0610f3-ab6b-4496-ba18-2794869a2831" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 835.667160] env[61439]: DEBUG nova.compute.manager [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 835.702646] env[61439]: DEBUG nova.virt.hardware [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 835.702929] env[61439]: DEBUG nova.virt.hardware [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 
tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 835.703127] env[61439]: DEBUG nova.virt.hardware [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 835.703386] env[61439]: DEBUG nova.virt.hardware [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 835.703584] env[61439]: DEBUG nova.virt.hardware [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 835.703771] env[61439]: DEBUG nova.virt.hardware [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 835.704031] env[61439]: DEBUG nova.virt.hardware [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 835.704230] env[61439]: DEBUG nova.virt.hardware [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 
tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 835.704440] env[61439]: DEBUG nova.virt.hardware [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 835.704635] env[61439]: DEBUG nova.virt.hardware [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 835.704845] env[61439]: DEBUG nova.virt.hardware [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 835.706162] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0611eb16-727e-4163-93ec-d02e5939ff29 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 835.715014] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f9e314e-4e73-444a-8258-64a785ce4dff {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 835.866972] env[61439]: DEBUG nova.policy [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Policy check for 
network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '027b7525da4940fe841f1f585b852b04', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '063546fa9ba840158e484a974edfd79a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 835.911558] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Acquiring lock "95a16efa-3240-4e15-ae19-03aaef61e2de" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 835.911558] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Lock "95a16efa-3240-4e15-ae19-03aaef61e2de" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 835.926029] env[61439]: DEBUG nova.compute.manager [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 836.011039] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 836.011370] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 836.013067] env[61439]: INFO nova.compute.claims [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 836.176591] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Acquiring lock "6288806c-d634-4749-8538-7188954788f0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 836.176812] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Lock "6288806c-d634-4749-8538-7188954788f0" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 836.271491] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9acc8a5-b1bd-49d8-bbe2-9b69f9bcacf0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 836.279450] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b73a26b-7124-4352-866c-ad9897a1a5d7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 836.284562] env[61439]: DEBUG nova.network.neutron [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 836.316866] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8017a255-5ed4-431b-bfc2-71a392665ba6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 836.318583] env[61439]: DEBUG oslo_concurrency.lockutils [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Releasing lock "refresh_cache-50266004-15d2-46cf-9f48-315f24831d24" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 836.318797] env[61439]: DEBUG nova.compute.manager [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 
tempest-AttachVolumeNegativeTest-1225387431-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 836.319566] env[61439]: DEBUG nova.compute.manager [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 836.319754] env[61439]: DEBUG nova.network.neutron [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 836.328464] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-996a0c32-f224-4063-81f6-b0e35a3eb622 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 836.345495] env[61439]: DEBUG nova.compute.provider_tree [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 836.357663] env[61439]: DEBUG nova.scheduler.client.report [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 
'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 836.385140] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.373s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 836.385307] env[61439]: DEBUG nova.compute.manager [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 836.442315] env[61439]: DEBUG nova.compute.utils [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 836.447018] env[61439]: DEBUG nova.network.neutron [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 836.447117] env[61439]: DEBUG nova.compute.manager [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 836.447604] env[61439]: DEBUG nova.network.neutron [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 836.456184] env[61439]: DEBUG nova.compute.manager [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 836.459870] env[61439]: DEBUG nova.network.neutron [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 836.469234] env[61439]: INFO nova.compute.manager [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 50266004-15d2-46cf-9f48-315f24831d24] Took 0.15 seconds to deallocate network for instance. 
[ 836.572942] env[61439]: DEBUG nova.compute.manager [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 836.592036] env[61439]: INFO nova.scheduler.client.report [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Deleted allocations for instance 50266004-15d2-46cf-9f48-315f24831d24
[ 836.606944] env[61439]: DEBUG nova.network.neutron [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 836.619475] env[61439]: DEBUG oslo_concurrency.lockutils [None req-abf94687-081c-4338-9ec4-e622a3c61eb0 tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Lock "50266004-15d2-46cf-9f48-315f24831d24" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 21.304s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 836.626527] env[61439]: DEBUG nova.virt.hardware [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 836.626677] env[61439]: DEBUG nova.virt.hardware [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 836.627421] env[61439]: DEBUG nova.virt.hardware [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 836.627421] env[61439]: DEBUG nova.virt.hardware [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 836.627421] env[61439]: DEBUG nova.virt.hardware [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 836.627421] env[61439]: DEBUG nova.virt.hardware [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 836.627421] env[61439]: DEBUG nova.virt.hardware [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 836.628117] env[61439]: DEBUG nova.virt.hardware [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 836.628117] env[61439]: DEBUG nova.virt.hardware [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 836.628117] env[61439]: DEBUG nova.virt.hardware [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 836.628232] env[61439]: DEBUG nova.virt.hardware [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 836.629283] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23d541f6-21ec-41fa-89da-1753e8d5e964 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 836.634137] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Releasing lock "refresh_cache-da0610f3-ab6b-4496-ba18-2794869a2831" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 836.634657] env[61439]: DEBUG nova.compute.manager [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 836.634768] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 836.636336] env[61439]: DEBUG oslo_concurrency.lockutils [req-c3f0e7e4-6f5b-48d7-8206-0cf2d06e6b77 req-37986425-e3ed-4d05-aebc-88e723e0e0d4 service nova] Acquired lock "refresh_cache-da0610f3-ab6b-4496-ba18-2794869a2831" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 836.636421] env[61439]: DEBUG nova.network.neutron [req-c3f0e7e4-6f5b-48d7-8206-0cf2d06e6b77 req-37986425-e3ed-4d05-aebc-88e723e0e0d4 service nova] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Refreshing network info cache for port 9c39d9bd-de1a-462d-9218-e51ebfb16c02 {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 836.637730] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8c3adc5d-0a31-4f98-a7cb-2d666d0b2594 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 836.639742] env[61439]: DEBUG nova.compute.manager [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 836.648195] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6527826-8fe0-4e2d-a99f-d061abe67497 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 836.662762] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-813eda7c-f960-439e-a7b2-1ee9cafb26a8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 836.699850] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance da0610f3-ab6b-4496-ba18-2794869a2831 could not be found.
[ 836.700100] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 836.700307] env[61439]: INFO nova.compute.manager [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Took 0.07 seconds to destroy the instance on the hypervisor.
[ 836.700587] env[61439]: DEBUG oslo.service.loopingcall [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 836.700818] env[61439]: DEBUG nova.compute.manager [-] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 836.700910] env[61439]: DEBUG nova.network.neutron [-] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 836.712066] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 836.712344] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 836.713849] env[61439]: INFO nova.compute.claims [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 836.785157] env[61439]: DEBUG nova.network.neutron [req-c3f0e7e4-6f5b-48d7-8206-0cf2d06e6b77 req-37986425-e3ed-4d05-aebc-88e723e0e0d4 service nova] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 836.847552] env[61439]: DEBUG nova.network.neutron [-] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 836.858919] env[61439]: DEBUG nova.network.neutron [-] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 836.864828] env[61439]: DEBUG nova.policy [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c468b82bd9d64e19b419a393fff4af06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f86a86563cc047459d3e7c0553c82c63', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}}
[ 836.873933] env[61439]: INFO nova.compute.manager [-] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Took 0.17 seconds to deallocate network for instance.
[ 836.876527] env[61439]: DEBUG nova.compute.claims [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 836.876714] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 836.949693] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b0e50a1-6bd4-4c51-8987-832c1c90dc6a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 836.961992] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b202c63-0796-46ec-a848-82cfce64b53e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 837.010271] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-caca17c0-72eb-491a-a102-52f436020fd1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 837.018736] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-488f6329-b2c4-497f-8f76-5e0ee1956a79 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 837.033402] env[61439]: DEBUG nova.compute.provider_tree [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 837.045583] env[61439]: DEBUG nova.scheduler.client.report [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 837.065132] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.352s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 837.065434] env[61439]: DEBUG nova.compute.manager [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 837.068796] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.191s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 837.114493] env[61439]: DEBUG nova.compute.utils [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 837.115825] env[61439]: DEBUG nova.compute.manager [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 837.115970] env[61439]: DEBUG nova.network.neutron [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 837.135832] env[61439]: DEBUG nova.compute.manager [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 837.222966] env[61439]: DEBUG nova.compute.manager [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 837.230941] env[61439]: DEBUG nova.network.neutron [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Successfully created port: 4f5def34-c667-41db-ab3e-33cee4e6b03a {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 837.256432] env[61439]: DEBUG nova.virt.hardware [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 837.256685] env[61439]: DEBUG nova.virt.hardware [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 837.256839] env[61439]: DEBUG nova.virt.hardware [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 837.257037] env[61439]: DEBUG nova.virt.hardware [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 837.257171] env[61439]: DEBUG nova.virt.hardware [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 837.257317] env[61439]: DEBUG nova.virt.hardware [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 837.257526] env[61439]: DEBUG nova.virt.hardware [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 837.257684] env[61439]: DEBUG nova.virt.hardware [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 837.257851] env[61439]: DEBUG nova.virt.hardware [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 837.258019] env[61439]: DEBUG nova.virt.hardware [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 837.258279] env[61439]: DEBUG nova.virt.hardware [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 837.259107] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d2a9b02-b14b-4305-92e6-52d06c03aa82 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 837.270761] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d60cefd2-b36a-4ad9-83e6-ed8807befb55 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 837.300189] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd8331a8-a055-4928-89ac-2ac859f736f9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 837.307655] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27b1bb5b-91d9-4359-a6c5-ee9e4e20e9e1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 837.343025] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f767e1c-8237-4718-9c0c-d4d8040d5688 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 837.351957] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e6894cd-3e60-4057-be3c-76589155f88c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 837.373020] env[61439]: DEBUG nova.compute.provider_tree [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 837.383916] env[61439]: DEBUG nova.scheduler.client.report [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 837.402231] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.334s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 837.402872] env[61439]: ERROR nova.compute.manager [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 9c39d9bd-de1a-462d-9218-e51ebfb16c02, please check neutron logs for more information.
[ 837.402872] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Traceback (most recent call last):
[ 837.402872] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 837.402872] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]     self.driver.spawn(context, instance, image_meta,
[ 837.402872] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 837.402872] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 837.402872] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 837.402872] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]     vm_ref = self.build_virtual_machine(instance,
[ 837.402872] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 837.402872] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 837.402872] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 837.403356] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]     for vif in network_info:
[ 837.403356] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 837.403356] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]     return self._sync_wrapper(fn, *args, **kwargs)
[ 837.403356] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 837.403356] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]     self.wait()
[ 837.403356] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 837.403356] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]     self[:] = self._gt.wait()
[ 837.403356] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 837.403356] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]     return self._exit_event.wait()
[ 837.403356] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 837.403356] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]     result = hub.switch()
[ 837.403356] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 837.403356] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]     return self.greenlet.switch()
[ 837.403775] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 837.403775] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]     result = function(*args, **kwargs)
[ 837.403775] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]   File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 837.403775] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]     return func(*args, **kwargs)
[ 837.403775] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]   File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 837.403775] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]     raise e
[ 837.403775] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]   File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 837.403775] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]     nwinfo = self.network_api.allocate_for_instance(
[ 837.403775] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831]   File 
"/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 837.403775] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] created_port_ids = self._update_ports_for_instance( [ 837.403775] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 837.403775] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] with excutils.save_and_reraise_exception(): [ 837.403775] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 837.404264] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] self.force_reraise() [ 837.404264] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 837.404264] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] raise self.value [ 837.404264] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 837.404264] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] updated_port = self._update_port( [ 837.404264] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 837.404264] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] _ensure_no_port_binding_failure(port) [ 837.404264] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] File 
"/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 837.404264] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] raise exception.PortBindingFailed(port_id=port['id']) [ 837.404264] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] nova.exception.PortBindingFailed: Binding failed for port 9c39d9bd-de1a-462d-9218-e51ebfb16c02, please check neutron logs for more information. [ 837.404264] env[61439]: ERROR nova.compute.manager [instance: da0610f3-ab6b-4496-ba18-2794869a2831] [ 837.404604] env[61439]: DEBUG nova.compute.utils [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Binding failed for port 9c39d9bd-de1a-462d-9218-e51ebfb16c02, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 837.405877] env[61439]: DEBUG nova.compute.manager [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Build of instance da0610f3-ab6b-4496-ba18-2794869a2831 was re-scheduled: Binding failed for port 9c39d9bd-de1a-462d-9218-e51ebfb16c02, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 837.405877] env[61439]: DEBUG nova.compute.manager [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 837.405877] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "refresh_cache-da0610f3-ab6b-4496-ba18-2794869a2831" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 837.474323] env[61439]: DEBUG nova.network.neutron [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Successfully created port: fd9d5f30-56bf-48b3-ab8b-02cf36c2a603 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 837.483287] env[61439]: DEBUG nova.policy [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '027b7525da4940fe841f1f585b852b04', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '063546fa9ba840158e484a974edfd79a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 837.902552] env[61439]: DEBUG 
nova.network.neutron [req-c3f0e7e4-6f5b-48d7-8206-0cf2d06e6b77 req-37986425-e3ed-4d05-aebc-88e723e0e0d4 service nova] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 837.917175] env[61439]: DEBUG oslo_concurrency.lockutils [req-c3f0e7e4-6f5b-48d7-8206-0cf2d06e6b77 req-37986425-e3ed-4d05-aebc-88e723e0e0d4 service nova] Releasing lock "refresh_cache-da0610f3-ab6b-4496-ba18-2794869a2831" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 837.917730] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquired lock "refresh_cache-da0610f3-ab6b-4496-ba18-2794869a2831" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 837.917810] env[61439]: DEBUG nova.network.neutron [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 837.961132] env[61439]: DEBUG nova.network.neutron [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Successfully created port: 42f49d5f-4672-4e9e-9e10-a5ab004efa63 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 838.003770] env[61439]: DEBUG nova.network.neutron [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 
da0610f3-ab6b-4496-ba18-2794869a2831] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 838.576332] env[61439]: DEBUG nova.compute.manager [req-c2f9dd7f-7775-41eb-b521-a4e464df39fe req-9d967532-a68d-41b9-9cc1-746e3f5b07d3 service nova] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Received event network-changed-c11db4be-ceba-4564-a97e-7f4fc2d8afa7 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 838.576332] env[61439]: DEBUG nova.compute.manager [req-c2f9dd7f-7775-41eb-b521-a4e464df39fe req-9d967532-a68d-41b9-9cc1-746e3f5b07d3 service nova] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Refreshing instance network info cache due to event network-changed-c11db4be-ceba-4564-a97e-7f4fc2d8afa7. {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 838.576332] env[61439]: DEBUG oslo_concurrency.lockutils [req-c2f9dd7f-7775-41eb-b521-a4e464df39fe req-9d967532-a68d-41b9-9cc1-746e3f5b07d3 service nova] Acquiring lock "refresh_cache-7c3017dc-afd5-438e-ae23-ca0d7d4c01af" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 838.576332] env[61439]: DEBUG oslo_concurrency.lockutils [req-c2f9dd7f-7775-41eb-b521-a4e464df39fe req-9d967532-a68d-41b9-9cc1-746e3f5b07d3 service nova] Acquired lock "refresh_cache-7c3017dc-afd5-438e-ae23-ca0d7d4c01af" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 838.576722] env[61439]: DEBUG nova.network.neutron [req-c2f9dd7f-7775-41eb-b521-a4e464df39fe req-9d967532-a68d-41b9-9cc1-746e3f5b07d3 service nova] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Refreshing network info cache for port c11db4be-ceba-4564-a97e-7f4fc2d8afa7 {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 838.828235] env[61439]: DEBUG nova.network.neutron 
[req-c2f9dd7f-7775-41eb-b521-a4e464df39fe req-9d967532-a68d-41b9-9cc1-746e3f5b07d3 service nova] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 839.038580] env[61439]: DEBUG nova.network.neutron [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 839.053016] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Releasing lock "refresh_cache-da0610f3-ab6b-4496-ba18-2794869a2831" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 839.053016] env[61439]: DEBUG nova.compute.manager [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 839.053016] env[61439]: DEBUG nova.compute.manager [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 839.053016] env[61439]: DEBUG nova.network.neutron [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 839.151440] env[61439]: DEBUG nova.network.neutron [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 839.163646] env[61439]: DEBUG nova.network.neutron [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 839.184224] env[61439]: INFO nova.compute.manager [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: da0610f3-ab6b-4496-ba18-2794869a2831] Took 0.13 seconds to deallocate network for instance. 
[ 839.259544] env[61439]: ERROR nova.compute.manager [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port c11db4be-ceba-4564-a97e-7f4fc2d8afa7, please check neutron logs for more information. [ 839.259544] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 839.259544] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 839.259544] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 839.259544] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 839.259544] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 839.259544] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 839.259544] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 839.259544] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 839.259544] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 839.259544] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 839.259544] env[61439]: ERROR nova.compute.manager raise self.value [ 839.259544] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 839.259544] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 839.259544] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", 
line 585, in _update_port [ 839.259544] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 839.260044] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 839.260044] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 839.260044] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port c11db4be-ceba-4564-a97e-7f4fc2d8afa7, please check neutron logs for more information. [ 839.260044] env[61439]: ERROR nova.compute.manager [ 839.260044] env[61439]: Traceback (most recent call last): [ 839.260044] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 839.260044] env[61439]: listener.cb(fileno) [ 839.260044] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 839.260044] env[61439]: result = function(*args, **kwargs) [ 839.260044] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 839.260044] env[61439]: return func(*args, **kwargs) [ 839.260044] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 839.260044] env[61439]: raise e [ 839.260044] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 839.260044] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 839.260044] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 839.260044] env[61439]: created_port_ids = self._update_ports_for_instance( [ 839.260044] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 839.260044] env[61439]: with excutils.save_and_reraise_exception(): [ 839.260044] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", 
line 227, in __exit__ [ 839.260044] env[61439]: self.force_reraise() [ 839.260044] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 839.260044] env[61439]: raise self.value [ 839.260044] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 839.260044] env[61439]: updated_port = self._update_port( [ 839.260044] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 839.260044] env[61439]: _ensure_no_port_binding_failure(port) [ 839.260044] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 839.260044] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 839.260965] env[61439]: nova.exception.PortBindingFailed: Binding failed for port c11db4be-ceba-4564-a97e-7f4fc2d8afa7, please check neutron logs for more information. [ 839.260965] env[61439]: Removing descriptor: 23 [ 839.260965] env[61439]: ERROR nova.compute.manager [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port c11db4be-ceba-4564-a97e-7f4fc2d8afa7, please check neutron logs for more information. 
[ 839.260965] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Traceback (most recent call last): [ 839.260965] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 839.260965] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] yield resources [ 839.260965] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 839.260965] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] self.driver.spawn(context, instance, image_meta, [ 839.260965] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 839.260965] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] self._vmops.spawn(context, instance, image_meta, injected_files, [ 839.260965] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 839.260965] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] vm_ref = self.build_virtual_machine(instance, [ 839.261422] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 839.261422] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] vif_infos = vmwarevif.get_vif_info(self._session, [ 839.261422] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 839.261422] env[61439]: ERROR 
nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] for vif in network_info: [ 839.261422] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 839.261422] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] return self._sync_wrapper(fn, *args, **kwargs) [ 839.261422] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 839.261422] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] self.wait() [ 839.261422] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 839.261422] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] self[:] = self._gt.wait() [ 839.261422] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 839.261422] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] return self._exit_event.wait() [ 839.261422] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 839.261832] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] result = hub.switch() [ 839.261832] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 839.261832] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] return self.greenlet.switch() [ 839.261832] 
env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 839.261832] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] result = function(*args, **kwargs) [ 839.261832] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 839.261832] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] return func(*args, **kwargs) [ 839.261832] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 839.261832] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] raise e [ 839.261832] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 839.261832] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] nwinfo = self.network_api.allocate_for_instance( [ 839.261832] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 839.261832] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] created_port_ids = self._update_ports_for_instance( [ 839.263036] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 839.263036] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] with excutils.save_and_reraise_exception(): [ 839.263036] env[61439]: ERROR nova.compute.manager 
[instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 839.263036] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] self.force_reraise() [ 839.263036] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 839.263036] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] raise self.value [ 839.263036] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 839.263036] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] updated_port = self._update_port( [ 839.263036] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 839.263036] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] _ensure_no_port_binding_failure(port) [ 839.263036] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 839.263036] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] raise exception.PortBindingFailed(port_id=port['id']) [ 839.263441] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] nova.exception.PortBindingFailed: Binding failed for port c11db4be-ceba-4564-a97e-7f4fc2d8afa7, please check neutron logs for more information. 
[ 839.263441] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] [ 839.263441] env[61439]: INFO nova.compute.manager [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Terminating instance [ 839.263822] env[61439]: DEBUG oslo_concurrency.lockutils [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquiring lock "refresh_cache-7c3017dc-afd5-438e-ae23-ca0d7d4c01af" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 839.333219] env[61439]: INFO nova.scheduler.client.report [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Deleted allocations for instance da0610f3-ab6b-4496-ba18-2794869a2831 [ 839.380079] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b5a007fe-b64a-46f0-ac66-8994d058b753 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "da0610f3-ab6b-4496-ba18-2794869a2831" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 18.855s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 839.903755] env[61439]: DEBUG nova.network.neutron [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Successfully created port: 08566279-b49f-411a-97b7-79b249e6fbac {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 839.934097] env[61439]: DEBUG nova.network.neutron [req-c2f9dd7f-7775-41eb-b521-a4e464df39fe 
req-9d967532-a68d-41b9-9cc1-746e3f5b07d3 service nova] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 839.945934] env[61439]: DEBUG oslo_concurrency.lockutils [req-c2f9dd7f-7775-41eb-b521-a4e464df39fe req-9d967532-a68d-41b9-9cc1-746e3f5b07d3 service nova] Releasing lock "refresh_cache-7c3017dc-afd5-438e-ae23-ca0d7d4c01af" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 839.946428] env[61439]: DEBUG oslo_concurrency.lockutils [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquired lock "refresh_cache-7c3017dc-afd5-438e-ae23-ca0d7d4c01af" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 839.946593] env[61439]: DEBUG nova.network.neutron [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 840.072558] env[61439]: DEBUG nova.network.neutron [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 840.149741] env[61439]: DEBUG nova.compute.manager [req-61542e02-0250-4f20-a84b-a19ad969de08 req-557b5568-30d7-4738-86b2-54f1767837b2 service nova] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Received event network-changed-408f5613-8187-43fd-9e67-c27b5776065e {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 840.149944] env[61439]: DEBUG nova.compute.manager [req-61542e02-0250-4f20-a84b-a19ad969de08 req-557b5568-30d7-4738-86b2-54f1767837b2 service nova] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Refreshing instance network info cache due to event network-changed-408f5613-8187-43fd-9e67-c27b5776065e. {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 840.150198] env[61439]: DEBUG oslo_concurrency.lockutils [req-61542e02-0250-4f20-a84b-a19ad969de08 req-557b5568-30d7-4738-86b2-54f1767837b2 service nova] Acquiring lock "refresh_cache-a7144e03-407c-436c-9e22-1eeaeb43a210" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 840.150394] env[61439]: DEBUG oslo_concurrency.lockutils [req-61542e02-0250-4f20-a84b-a19ad969de08 req-557b5568-30d7-4738-86b2-54f1767837b2 service nova] Acquired lock "refresh_cache-a7144e03-407c-436c-9e22-1eeaeb43a210" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 840.150538] env[61439]: DEBUG nova.network.neutron [req-61542e02-0250-4f20-a84b-a19ad969de08 req-557b5568-30d7-4738-86b2-54f1767837b2 service nova] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Refreshing network info cache for port 408f5613-8187-43fd-9e67-c27b5776065e {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 840.258409] env[61439]: DEBUG nova.network.neutron [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 
tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Successfully created port: da804bde-daba-479b-ab5f-0742b2a78cec {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 840.332816] env[61439]: DEBUG nova.network.neutron [req-61542e02-0250-4f20-a84b-a19ad969de08 req-557b5568-30d7-4738-86b2-54f1767837b2 service nova] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 840.423815] env[61439]: DEBUG nova.network.neutron [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Successfully created port: 012018bf-5756-4f0d-b30d-4289d62d6c2f {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 841.037733] env[61439]: DEBUG nova.network.neutron [req-61542e02-0250-4f20-a84b-a19ad969de08 req-557b5568-30d7-4738-86b2-54f1767837b2 service nova] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 841.055342] env[61439]: DEBUG oslo_concurrency.lockutils [req-61542e02-0250-4f20-a84b-a19ad969de08 req-557b5568-30d7-4738-86b2-54f1767837b2 service nova] Releasing lock "refresh_cache-a7144e03-407c-436c-9e22-1eeaeb43a210" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 841.234698] env[61439]: DEBUG nova.network.neutron [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info 
/opt/stack/nova/nova/network/neutron.py:116}} [ 841.251704] env[61439]: DEBUG oslo_concurrency.lockutils [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Releasing lock "refresh_cache-7c3017dc-afd5-438e-ae23-ca0d7d4c01af" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 841.252140] env[61439]: DEBUG nova.compute.manager [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 841.252366] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 841.252936] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a51b9c22-25aa-4120-b82e-4432e9abc574 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 841.266111] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41d8e3b5-e2bd-4282-8ede-1752f3ead559 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 841.297622] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Instance does not exist on backend: 
nova.exception.InstanceNotFound: Instance 7c3017dc-afd5-438e-ae23-ca0d7d4c01af could not be found. [ 841.297888] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 841.298088] env[61439]: INFO nova.compute.manager [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Took 0.05 seconds to destroy the instance on the hypervisor. [ 841.298348] env[61439]: DEBUG oslo.service.loopingcall [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 841.298834] env[61439]: DEBUG nova.compute.manager [-] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 841.298834] env[61439]: DEBUG nova.network.neutron [-] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 841.302598] env[61439]: ERROR nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 408f5613-8187-43fd-9e67-c27b5776065e, please check neutron logs for more information. [ 841.302598] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 841.302598] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 841.302598] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 841.302598] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 841.302598] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 841.302598] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 841.302598] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 841.302598] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 841.302598] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 841.302598] env[61439]: 
ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 841.302598] env[61439]: ERROR nova.compute.manager raise self.value [ 841.302598] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 841.302598] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 841.302598] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 841.302598] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 841.303165] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 841.303165] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 841.303165] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 408f5613-8187-43fd-9e67-c27b5776065e, please check neutron logs for more information. 
[ 841.303165] env[61439]: ERROR nova.compute.manager [ 841.303165] env[61439]: Traceback (most recent call last): [ 841.303165] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 841.303165] env[61439]: listener.cb(fileno) [ 841.303165] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 841.303165] env[61439]: result = function(*args, **kwargs) [ 841.303165] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 841.303165] env[61439]: return func(*args, **kwargs) [ 841.303165] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 841.303165] env[61439]: raise e [ 841.303165] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 841.303165] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 841.303165] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 841.303165] env[61439]: created_port_ids = self._update_ports_for_instance( [ 841.303165] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 841.303165] env[61439]: with excutils.save_and_reraise_exception(): [ 841.303165] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 841.303165] env[61439]: self.force_reraise() [ 841.303165] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 841.303165] env[61439]: raise self.value [ 841.303165] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 841.303165] env[61439]: updated_port = self._update_port( [ 841.303165] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 841.303165] env[61439]: 
_ensure_no_port_binding_failure(port) [ 841.303165] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 841.303165] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 841.304032] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 408f5613-8187-43fd-9e67-c27b5776065e, please check neutron logs for more information. [ 841.304032] env[61439]: Removing descriptor: 10 [ 841.304032] env[61439]: ERROR nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 408f5613-8187-43fd-9e67-c27b5776065e, please check neutron logs for more information. [ 841.304032] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Traceback (most recent call last): [ 841.304032] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 841.304032] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] yield resources [ 841.304032] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 841.304032] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] self.driver.spawn(context, instance, image_meta, [ 841.304032] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 841.304032] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] self._vmops.spawn(context, instance, image_meta, injected_files, [ 
841.304032] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 841.304032] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] vm_ref = self.build_virtual_machine(instance, [ 841.304436] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 841.304436] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] vif_infos = vmwarevif.get_vif_info(self._session, [ 841.304436] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 841.304436] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] for vif in network_info: [ 841.304436] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 841.304436] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] return self._sync_wrapper(fn, *args, **kwargs) [ 841.304436] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 841.304436] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] self.wait() [ 841.304436] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 841.304436] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] self[:] = self._gt.wait() [ 841.304436] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 841.304436] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] return self._exit_event.wait() [ 841.304436] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 841.304831] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] result = hub.switch() [ 841.304831] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 841.304831] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] return self.greenlet.switch() [ 841.304831] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 841.304831] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] result = function(*args, **kwargs) [ 841.304831] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 841.304831] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] return func(*args, **kwargs) [ 841.304831] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 841.304831] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] raise e [ 841.304831] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in 
_allocate_network_async [ 841.304831] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] nwinfo = self.network_api.allocate_for_instance( [ 841.304831] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 841.304831] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] created_port_ids = self._update_ports_for_instance( [ 841.305343] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 841.305343] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] with excutils.save_and_reraise_exception(): [ 841.305343] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 841.305343] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] self.force_reraise() [ 841.305343] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 841.305343] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] raise self.value [ 841.305343] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 841.305343] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] updated_port = self._update_port( [ 841.305343] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/network/neutron.py", line 585, in 
_update_port [ 841.305343] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] _ensure_no_port_binding_failure(port) [ 841.305343] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 841.305343] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] raise exception.PortBindingFailed(port_id=port['id']) [ 841.305720] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] nova.exception.PortBindingFailed: Binding failed for port 408f5613-8187-43fd-9e67-c27b5776065e, please check neutron logs for more information. [ 841.305720] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] [ 841.305720] env[61439]: INFO nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Terminating instance [ 841.305824] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Acquiring lock "refresh_cache-a7144e03-407c-436c-9e22-1eeaeb43a210" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 841.306134] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Acquired lock "refresh_cache-a7144e03-407c-436c-9e22-1eeaeb43a210" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 841.306305] env[61439]: DEBUG nova.network.neutron [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 
tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 841.392536] env[61439]: DEBUG nova.network.neutron [-] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 841.403103] env[61439]: DEBUG nova.network.neutron [-] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 841.419806] env[61439]: DEBUG nova.network.neutron [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 841.423634] env[61439]: INFO nova.compute.manager [-] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Took 0.12 seconds to deallocate network for instance. 
[ 841.428728] env[61439]: DEBUG nova.compute.claims [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 841.428950] env[61439]: DEBUG oslo_concurrency.lockutils [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 841.429188] env[61439]: DEBUG oslo_concurrency.lockutils [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 841.640710] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64fddf14-1b27-43c1-badd-8e0d5fbe6022 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 841.647958] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed072be0-11f3-4618-8e56-4204ba69f0a0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 841.683730] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97209aa1-5c9c-4ff0-a3c2-e5862fda0264 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 841.691394] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15cfbe7f-8e06-4842-80c5-43d8c7810e1c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 841.705815] env[61439]: DEBUG nova.compute.provider_tree [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 841.716458] env[61439]: DEBUG nova.scheduler.client.report [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 841.732337] env[61439]: DEBUG oslo_concurrency.lockutils [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.303s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 841.733303] env[61439]: ERROR nova.compute.manager [None 
req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port c11db4be-ceba-4564-a97e-7f4fc2d8afa7, please check neutron logs for more information. [ 841.733303] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Traceback (most recent call last): [ 841.733303] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 841.733303] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] self.driver.spawn(context, instance, image_meta, [ 841.733303] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 841.733303] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] self._vmops.spawn(context, instance, image_meta, injected_files, [ 841.733303] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 841.733303] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] vm_ref = self.build_virtual_machine(instance, [ 841.733303] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 841.733303] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] vif_infos = vmwarevif.get_vif_info(self._session, [ 841.733303] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File 
"/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 841.733799] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] for vif in network_info: [ 841.733799] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 841.733799] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] return self._sync_wrapper(fn, *args, **kwargs) [ 841.733799] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 841.733799] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] self.wait() [ 841.733799] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 841.733799] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] self[:] = self._gt.wait() [ 841.733799] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 841.733799] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] return self._exit_event.wait() [ 841.733799] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 841.733799] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] result = hub.switch() [ 841.733799] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 841.733799] env[61439]: ERROR 
nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] return self.greenlet.switch() [ 841.734219] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 841.734219] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] result = function(*args, **kwargs) [ 841.734219] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 841.734219] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] return func(*args, **kwargs) [ 841.734219] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 841.734219] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] raise e [ 841.734219] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 841.734219] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] nwinfo = self.network_api.allocate_for_instance( [ 841.734219] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 841.734219] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] created_port_ids = self._update_ports_for_instance( [ 841.734219] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 841.734219] env[61439]: ERROR nova.compute.manager [instance: 
7c3017dc-afd5-438e-ae23-ca0d7d4c01af] with excutils.save_and_reraise_exception(): [ 841.734219] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 841.734715] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] self.force_reraise() [ 841.734715] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 841.734715] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] raise self.value [ 841.734715] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 841.734715] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] updated_port = self._update_port( [ 841.734715] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 841.734715] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] _ensure_no_port_binding_failure(port) [ 841.734715] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 841.734715] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] raise exception.PortBindingFailed(port_id=port['id']) [ 841.734715] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] nova.exception.PortBindingFailed: Binding failed for port c11db4be-ceba-4564-a97e-7f4fc2d8afa7, please check neutron logs for more information. 
[ 841.734715] env[61439]: ERROR nova.compute.manager [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] [ 841.735098] env[61439]: DEBUG nova.compute.utils [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Binding failed for port c11db4be-ceba-4564-a97e-7f4fc2d8afa7, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 841.735957] env[61439]: DEBUG nova.compute.manager [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Build of instance 7c3017dc-afd5-438e-ae23-ca0d7d4c01af was re-scheduled: Binding failed for port c11db4be-ceba-4564-a97e-7f4fc2d8afa7, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 841.736443] env[61439]: DEBUG nova.compute.manager [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 841.736709] env[61439]: DEBUG oslo_concurrency.lockutils [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquiring lock "refresh_cache-7c3017dc-afd5-438e-ae23-ca0d7d4c01af" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 841.736890] env[61439]: DEBUG oslo_concurrency.lockutils [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 
tempest-ServerDiskConfigTestJSON-1897992284-project-member] Acquired lock "refresh_cache-7c3017dc-afd5-438e-ae23-ca0d7d4c01af" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 841.737282] env[61439]: DEBUG nova.network.neutron [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 841.869916] env[61439]: DEBUG nova.network.neutron [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 842.100508] env[61439]: DEBUG nova.network.neutron [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Successfully created port: fc4313b5-9624-4bf9-b40d-f1080c81c289 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 842.466682] env[61439]: DEBUG nova.network.neutron [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 842.477991] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Releasing lock 
"refresh_cache-a7144e03-407c-436c-9e22-1eeaeb43a210" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 842.478399] env[61439]: DEBUG nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 842.478596] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 842.479573] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-103235e7-ef48-4176-9e89-7459cef2de20 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 842.489232] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6e9c0cf-828a-4b09-903a-2dd9a6bdb18e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 842.512240] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a7144e03-407c-436c-9e22-1eeaeb43a210 could not be found. 
[ 842.512579] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 842.512818] env[61439]: INFO nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Took 0.03 seconds to destroy the instance on the hypervisor. [ 842.513352] env[61439]: DEBUG oslo.service.loopingcall [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 842.513622] env[61439]: DEBUG nova.compute.manager [-] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 842.513768] env[61439]: DEBUG nova.network.neutron [-] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 842.890292] env[61439]: DEBUG nova.network.neutron [-] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 842.904587] env[61439]: DEBUG nova.network.neutron [-] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 842.918187] env[61439]: INFO nova.compute.manager [-] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Took 0.40 seconds to deallocate network for instance. [ 842.923725] env[61439]: DEBUG nova.compute.claims [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 842.923965] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 842.924283] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 842.947330] env[61439]: DEBUG nova.network.neutron [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Updating instance_info_cache with network_info: [] {{(pid=61439) 
update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 842.962677] env[61439]: DEBUG oslo_concurrency.lockutils [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Releasing lock "refresh_cache-7c3017dc-afd5-438e-ae23-ca0d7d4c01af" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 842.962809] env[61439]: DEBUG nova.compute.manager [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 842.963278] env[61439]: DEBUG nova.compute.manager [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 842.963278] env[61439]: DEBUG nova.network.neutron [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 843.065903] env[61439]: DEBUG nova.network.neutron [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 843.077718] env[61439]: DEBUG nova.network.neutron [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 843.090595] env[61439]: INFO nova.compute.manager [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] [instance: 7c3017dc-afd5-438e-ae23-ca0d7d4c01af] Took 0.13 seconds to deallocate network for instance. [ 843.159155] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f1c6807-82dc-4104-845f-4f9e930036c5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 843.168869] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33f98c14-dc02-4e54-9bea-7ef89a9170d9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 843.212130] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42e6ad10-4a53-433b-ac8e-1ab7d6ec551a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 843.219204] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac516857-a0ce-4d93-b185-dc8ceaee9ea7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 843.234693] env[61439]: DEBUG nova.compute.provider_tree [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 
tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 843.236683] env[61439]: INFO nova.scheduler.client.report [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Deleted allocations for instance 7c3017dc-afd5-438e-ae23-ca0d7d4c01af [ 843.245130] env[61439]: DEBUG nova.scheduler.client.report [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 843.265266] env[61439]: DEBUG oslo_concurrency.lockutils [None req-34c11c68-0316-4863-b52d-c2ed6796f28d tempest-ServerDiskConfigTestJSON-1897992284 tempest-ServerDiskConfigTestJSON-1897992284-project-member] Lock "7c3017dc-afd5-438e-ae23-ca0d7d4c01af" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 20.279s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 843.280451] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] 
Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.356s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 843.281330] env[61439]: ERROR nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 408f5613-8187-43fd-9e67-c27b5776065e, please check neutron logs for more information. [ 843.281330] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Traceback (most recent call last): [ 843.281330] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 843.281330] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] self.driver.spawn(context, instance, image_meta, [ 843.281330] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 843.281330] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] self._vmops.spawn(context, instance, image_meta, injected_files, [ 843.281330] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 843.281330] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] vm_ref = self.build_virtual_machine(instance, [ 843.281330] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 843.281330] 
env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] vif_infos = vmwarevif.get_vif_info(self._session, [ 843.281330] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 843.281771] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] for vif in network_info: [ 843.281771] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 843.281771] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] return self._sync_wrapper(fn, *args, **kwargs) [ 843.281771] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 843.281771] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] self.wait() [ 843.281771] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 843.281771] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] self[:] = self._gt.wait() [ 843.281771] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 843.281771] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] return self._exit_event.wait() [ 843.281771] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 843.281771] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] result = 
hub.switch() [ 843.281771] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 843.281771] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] return self.greenlet.switch() [ 843.282198] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 843.282198] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] result = function(*args, **kwargs) [ 843.282198] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 843.282198] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] return func(*args, **kwargs) [ 843.282198] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 843.282198] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] raise e [ 843.282198] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 843.282198] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] nwinfo = self.network_api.allocate_for_instance( [ 843.282198] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 843.282198] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] created_port_ids = self._update_ports_for_instance( [ 843.282198] env[61439]: ERROR 
nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 843.282198] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] with excutils.save_and_reraise_exception(): [ 843.282198] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 843.282657] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] self.force_reraise() [ 843.282657] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 843.282657] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] raise self.value [ 843.282657] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 843.282657] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] updated_port = self._update_port( [ 843.282657] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 843.282657] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] _ensure_no_port_binding_failure(port) [ 843.282657] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 843.282657] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] raise exception.PortBindingFailed(port_id=port['id']) [ 843.282657] env[61439]: ERROR 
nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] nova.exception.PortBindingFailed: Binding failed for port 408f5613-8187-43fd-9e67-c27b5776065e, please check neutron logs for more information. [ 843.282657] env[61439]: ERROR nova.compute.manager [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] [ 843.282998] env[61439]: DEBUG nova.compute.utils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Binding failed for port 408f5613-8187-43fd-9e67-c27b5776065e, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 843.284913] env[61439]: DEBUG nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Build of instance a7144e03-407c-436c-9e22-1eeaeb43a210 was re-scheduled: Binding failed for port 408f5613-8187-43fd-9e67-c27b5776065e, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 843.285597] env[61439]: DEBUG nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 843.285686] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Acquiring lock "refresh_cache-a7144e03-407c-436c-9e22-1eeaeb43a210" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 843.285802] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Acquired lock "refresh_cache-a7144e03-407c-436c-9e22-1eeaeb43a210" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 843.285967] env[61439]: DEBUG nova.network.neutron [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 843.417071] env[61439]: DEBUG nova.network.neutron [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 844.256238] env[61439]: DEBUG nova.network.neutron [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 844.265376] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Releasing lock "refresh_cache-a7144e03-407c-436c-9e22-1eeaeb43a210" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 844.265755] env[61439]: DEBUG nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 844.265838] env[61439]: DEBUG nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 844.265951] env[61439]: DEBUG nova.network.neutron [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 844.352421] env[61439]: DEBUG nova.network.neutron [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 844.361664] env[61439]: DEBUG nova.network.neutron [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 844.372047] env[61439]: INFO nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: a7144e03-407c-436c-9e22-1eeaeb43a210] Took 0.11 seconds to deallocate network for instance. 
[ 844.504749] env[61439]: INFO nova.scheduler.client.report [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Deleted allocations for instance a7144e03-407c-436c-9e22-1eeaeb43a210 [ 844.535145] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Lock "a7144e03-407c-436c-9e22-1eeaeb43a210" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.173s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 845.241099] env[61439]: DEBUG nova.compute.manager [req-95e51a12-2d61-423d-b916-3eebd5d41769 req-b058f041-42d8-4b41-9bc7-7ed704f33138 service nova] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Received event network-changed-fd9d5f30-56bf-48b3-ab8b-02cf36c2a603 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 845.241297] env[61439]: DEBUG nova.compute.manager [req-95e51a12-2d61-423d-b916-3eebd5d41769 req-b058f041-42d8-4b41-9bc7-7ed704f33138 service nova] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Refreshing instance network info cache due to event network-changed-fd9d5f30-56bf-48b3-ab8b-02cf36c2a603. 
{{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 845.241519] env[61439]: DEBUG oslo_concurrency.lockutils [req-95e51a12-2d61-423d-b916-3eebd5d41769 req-b058f041-42d8-4b41-9bc7-7ed704f33138 service nova] Acquiring lock "refresh_cache-2d8f035e-8a57-4937-a394-10d94a5630ad" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 845.241663] env[61439]: DEBUG oslo_concurrency.lockutils [req-95e51a12-2d61-423d-b916-3eebd5d41769 req-b058f041-42d8-4b41-9bc7-7ed704f33138 service nova] Acquired lock "refresh_cache-2d8f035e-8a57-4937-a394-10d94a5630ad" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 845.241851] env[61439]: DEBUG nova.network.neutron [req-95e51a12-2d61-423d-b916-3eebd5d41769 req-b058f041-42d8-4b41-9bc7-7ed704f33138 service nova] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Refreshing network info cache for port fd9d5f30-56bf-48b3-ab8b-02cf36c2a603 {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 845.277010] env[61439]: DEBUG oslo_concurrency.lockutils [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "ce8c0523-8a11-4da8-af00-ee6b246ffac4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 845.277709] env[61439]: DEBUG oslo_concurrency.lockutils [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "ce8c0523-8a11-4da8-af00-ee6b246ffac4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=61439) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 845.289745] env[61439]: DEBUG nova.compute.manager [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 845.348862] env[61439]: DEBUG oslo_concurrency.lockutils [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 845.349684] env[61439]: DEBUG oslo_concurrency.lockutils [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 845.353159] env[61439]: INFO nova.compute.claims [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 845.385165] env[61439]: DEBUG nova.network.neutron [req-95e51a12-2d61-423d-b916-3eebd5d41769 req-b058f041-42d8-4b41-9bc7-7ed704f33138 service nova] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 845.574033] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc219401-00fb-4470-8dab-01232563617f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 845.582418] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fcf07ca-f22e-490a-9740-dfd2c5131af0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 845.619756] env[61439]: ERROR nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port fd9d5f30-56bf-48b3-ab8b-02cf36c2a603, please check neutron logs for more information. 
[ 845.619756] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 845.619756] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 845.619756] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 845.619756] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 845.619756] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 845.619756] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 845.619756] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 845.619756] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 845.619756] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 845.619756] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 845.619756] env[61439]: ERROR nova.compute.manager raise self.value [ 845.619756] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 845.619756] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 845.619756] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 845.619756] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 845.620348] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 845.620348] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 845.620348] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port fd9d5f30-56bf-48b3-ab8b-02cf36c2a603, please check neutron logs for more information. [ 845.620348] env[61439]: ERROR nova.compute.manager [ 845.620348] env[61439]: Traceback (most recent call last): [ 845.620348] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 845.620348] env[61439]: listener.cb(fileno) [ 845.620348] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 845.620348] env[61439]: result = function(*args, **kwargs) [ 845.620348] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 845.620348] env[61439]: return func(*args, **kwargs) [ 845.620348] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 845.620348] env[61439]: raise e [ 845.620348] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 845.620348] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 845.620348] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 845.620348] env[61439]: created_port_ids = self._update_ports_for_instance( [ 845.620348] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 845.620348] env[61439]: with excutils.save_and_reraise_exception(): [ 845.620348] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 845.620348] env[61439]: self.force_reraise() [ 845.620348] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 845.620348] env[61439]: raise self.value [ 845.620348] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 845.620348] env[61439]: 
updated_port = self._update_port( [ 845.620348] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 845.620348] env[61439]: _ensure_no_port_binding_failure(port) [ 845.620348] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 845.620348] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 845.621216] env[61439]: nova.exception.PortBindingFailed: Binding failed for port fd9d5f30-56bf-48b3-ab8b-02cf36c2a603, please check neutron logs for more information. [ 845.621216] env[61439]: Removing descriptor: 26 [ 845.621216] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f9b8b38-3df2-4ac8-b8c8-d3ccceea8f32 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 845.624302] env[61439]: ERROR nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port fd9d5f30-56bf-48b3-ab8b-02cf36c2a603, please check neutron logs for more information. 
[ 845.624302] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Traceback (most recent call last): [ 845.624302] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 845.624302] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] yield resources [ 845.624302] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 845.624302] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] self.driver.spawn(context, instance, image_meta, [ 845.624302] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 845.624302] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] self._vmops.spawn(context, instance, image_meta, injected_files, [ 845.624302] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 845.624302] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] vm_ref = self.build_virtual_machine(instance, [ 845.624302] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 845.624718] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] vif_infos = vmwarevif.get_vif_info(self._session, [ 845.624718] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 845.624718] env[61439]: ERROR 
nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] for vif in network_info: [ 845.624718] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 845.624718] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] return self._sync_wrapper(fn, *args, **kwargs) [ 845.624718] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 845.624718] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] self.wait() [ 845.624718] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 845.624718] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] self[:] = self._gt.wait() [ 845.624718] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 845.624718] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] return self._exit_event.wait() [ 845.624718] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 845.624718] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] result = hub.switch() [ 845.625134] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 845.625134] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] return self.greenlet.switch() [ 845.625134] 
env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 845.625134] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] result = function(*args, **kwargs) [ 845.625134] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 845.625134] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] return func(*args, **kwargs) [ 845.625134] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 845.625134] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] raise e [ 845.625134] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 845.625134] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] nwinfo = self.network_api.allocate_for_instance( [ 845.625134] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 845.625134] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] created_port_ids = self._update_ports_for_instance( [ 845.625134] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 845.625708] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] with excutils.save_and_reraise_exception(): [ 845.625708] env[61439]: ERROR nova.compute.manager 
[instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 845.625708] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] self.force_reraise() [ 845.625708] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 845.625708] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] raise self.value [ 845.625708] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 845.625708] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] updated_port = self._update_port( [ 845.625708] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 845.625708] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] _ensure_no_port_binding_failure(port) [ 845.625708] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 845.625708] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] raise exception.PortBindingFailed(port_id=port['id']) [ 845.625708] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] nova.exception.PortBindingFailed: Binding failed for port fd9d5f30-56bf-48b3-ab8b-02cf36c2a603, please check neutron logs for more information. 
[ 845.625708] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] [ 845.626526] env[61439]: INFO nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Terminating instance [ 845.626950] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Acquiring lock "refresh_cache-2d8f035e-8a57-4937-a394-10d94a5630ad" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 845.631799] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f2b9d89-2575-4c87-85c8-028ae8a84f00 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 845.646852] env[61439]: DEBUG nova.compute.provider_tree [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 845.662434] env[61439]: DEBUG nova.scheduler.client.report [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 
35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 845.678837] env[61439]: DEBUG oslo_concurrency.lockutils [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.329s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 845.680935] env[61439]: DEBUG nova.compute.manager [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 845.737689] env[61439]: DEBUG nova.compute.utils [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 845.739011] env[61439]: DEBUG nova.compute.manager [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 845.739187] env[61439]: DEBUG nova.network.neutron [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 845.753550] env[61439]: DEBUG nova.compute.manager [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 845.836313] env[61439]: DEBUG nova.compute.manager [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 845.866179] env[61439]: DEBUG nova.virt.hardware [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 845.866513] env[61439]: DEBUG nova.virt.hardware [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 845.866594] env[61439]: DEBUG nova.virt.hardware [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 845.866844] env[61439]: DEBUG nova.virt.hardware [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Flavor pref 0:0:0 {{(pid=61439) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 845.866919] env[61439]: DEBUG nova.virt.hardware [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 845.867111] env[61439]: DEBUG nova.virt.hardware [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 845.867327] env[61439]: DEBUG nova.virt.hardware [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 845.867490] env[61439]: DEBUG nova.virt.hardware [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 845.867657] env[61439]: DEBUG nova.virt.hardware [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 845.867821] env[61439]: DEBUG nova.virt.hardware [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 845.867994] env[61439]: DEBUG nova.virt.hardware [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 845.868857] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7d451ef-a43f-4ef1-ac63-7cddca6ccab0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 845.877841] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89e1f573-3aab-4270-b2ad-147c5aa36488 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 846.105219] env[61439]: DEBUG nova.network.neutron [req-95e51a12-2d61-423d-b916-3eebd5d41769 req-b058f041-42d8-4b41-9bc7-7ed704f33138 service nova] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 846.118411] env[61439]: DEBUG oslo_concurrency.lockutils [req-95e51a12-2d61-423d-b916-3eebd5d41769 req-b058f041-42d8-4b41-9bc7-7ed704f33138 service nova] Releasing lock "refresh_cache-2d8f035e-8a57-4937-a394-10d94a5630ad" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 846.118840] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Acquired lock "refresh_cache-2d8f035e-8a57-4937-a394-10d94a5630ad" {{(pid=61439) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 846.119039] env[61439]: DEBUG nova.network.neutron [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 846.177416] env[61439]: DEBUG nova.policy [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2af2fd8431af45ca891f744f4d10b54f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ca364a2df93a424f8b66ee39d9b0b120', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 846.181549] env[61439]: DEBUG nova.compute.manager [req-50f99f7e-e0b2-41e4-aa4e-59d1c1f92b8e req-cd8521ef-be93-4c7e-a084-7537434d4cb9 service nova] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Received event network-changed-4f5def34-c667-41db-ab3e-33cee4e6b03a {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 846.181744] env[61439]: DEBUG nova.compute.manager [req-50f99f7e-e0b2-41e4-aa4e-59d1c1f92b8e req-cd8521ef-be93-4c7e-a084-7537434d4cb9 service nova] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Refreshing instance network info cache due to event network-changed-4f5def34-c667-41db-ab3e-33cee4e6b03a. 
{{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 846.181961] env[61439]: DEBUG oslo_concurrency.lockutils [req-50f99f7e-e0b2-41e4-aa4e-59d1c1f92b8e req-cd8521ef-be93-4c7e-a084-7537434d4cb9 service nova] Acquiring lock "refresh_cache-a0be018c-59be-4eb0-a3aa-b7814ce240e8" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 846.182123] env[61439]: DEBUG oslo_concurrency.lockutils [req-50f99f7e-e0b2-41e4-aa4e-59d1c1f92b8e req-cd8521ef-be93-4c7e-a084-7537434d4cb9 service nova] Acquired lock "refresh_cache-a0be018c-59be-4eb0-a3aa-b7814ce240e8" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 846.182331] env[61439]: DEBUG nova.network.neutron [req-50f99f7e-e0b2-41e4-aa4e-59d1c1f92b8e req-cd8521ef-be93-4c7e-a084-7537434d4cb9 service nova] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Refreshing network info cache for port 4f5def34-c667-41db-ab3e-33cee4e6b03a {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 846.195638] env[61439]: DEBUG nova.network.neutron [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 846.346154] env[61439]: DEBUG nova.network.neutron [req-50f99f7e-e0b2-41e4-aa4e-59d1c1f92b8e req-cd8521ef-be93-4c7e-a084-7537434d4cb9 service nova] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 846.676481] env[61439]: DEBUG nova.network.neutron [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 846.687143] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Releasing lock "refresh_cache-2d8f035e-8a57-4937-a394-10d94a5630ad" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 846.690050] env[61439]: DEBUG nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 846.690050] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 846.690050] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5d99fdaf-8d53-492a-adcd-69a0ac86fde2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 846.703414] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aebefbb7-2aaf-4ee0-bc0b-cc515748f762 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 846.730107] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 2d8f035e-8a57-4937-a394-10d94a5630ad could not be found. [ 846.730107] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 846.730107] env[61439]: INFO nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 846.730745] env[61439]: DEBUG oslo.service.loopingcall [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 846.731407] env[61439]: DEBUG nova.network.neutron [req-50f99f7e-e0b2-41e4-aa4e-59d1c1f92b8e req-cd8521ef-be93-4c7e-a084-7537434d4cb9 service nova] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 846.732588] env[61439]: DEBUG nova.compute.manager [-] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 846.732821] env[61439]: DEBUG nova.network.neutron [-] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 846.747256] env[61439]: DEBUG oslo_concurrency.lockutils [req-50f99f7e-e0b2-41e4-aa4e-59d1c1f92b8e req-cd8521ef-be93-4c7e-a084-7537434d4cb9 service nova] Releasing lock "refresh_cache-a0be018c-59be-4eb0-a3aa-b7814ce240e8" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 846.771796] env[61439]: DEBUG nova.network.neutron [-] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 846.777785] env[61439]: ERROR nova.compute.manager [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 4f5def34-c667-41db-ab3e-33cee4e6b03a, please check neutron logs for more information. [ 846.777785] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 846.777785] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 846.777785] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 846.777785] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 846.777785] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 846.777785] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 846.777785] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 846.777785] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 846.777785] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 846.777785] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 846.777785] env[61439]: ERROR nova.compute.manager raise self.value [ 846.777785] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 846.777785] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 846.777785] 
env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 846.777785] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 846.778359] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 846.778359] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 846.778359] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 4f5def34-c667-41db-ab3e-33cee4e6b03a, please check neutron logs for more information. [ 846.778359] env[61439]: ERROR nova.compute.manager [ 846.778359] env[61439]: Traceback (most recent call last): [ 846.778359] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 846.778359] env[61439]: listener.cb(fileno) [ 846.778359] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 846.778359] env[61439]: result = function(*args, **kwargs) [ 846.778359] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 846.778359] env[61439]: return func(*args, **kwargs) [ 846.778359] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 846.778359] env[61439]: raise e [ 846.778359] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 846.778359] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 846.778359] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 846.778359] env[61439]: created_port_ids = self._update_ports_for_instance( [ 846.778359] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 846.778359] env[61439]: with excutils.save_and_reraise_exception(): [ 846.778359] 
env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 846.778359] env[61439]: self.force_reraise() [ 846.778359] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 846.778359] env[61439]: raise self.value [ 846.778359] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 846.778359] env[61439]: updated_port = self._update_port( [ 846.778359] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 846.778359] env[61439]: _ensure_no_port_binding_failure(port) [ 846.778359] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 846.778359] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 846.779315] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 4f5def34-c667-41db-ab3e-33cee4e6b03a, please check neutron logs for more information. [ 846.779315] env[61439]: Removing descriptor: 22 [ 846.779315] env[61439]: ERROR nova.compute.manager [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 4f5def34-c667-41db-ab3e-33cee4e6b03a, please check neutron logs for more information. 
[ 846.779315] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Traceback (most recent call last): [ 846.779315] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 846.779315] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] yield resources [ 846.779315] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 846.779315] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] self.driver.spawn(context, instance, image_meta, [ 846.779315] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 846.779315] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 846.779315] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 846.779315] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] vm_ref = self.build_virtual_machine(instance, [ 846.779741] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 846.779741] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] vif_infos = vmwarevif.get_vif_info(self._session, [ 846.779741] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 846.779741] env[61439]: ERROR 
nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] for vif in network_info: [ 846.779741] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 846.779741] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] return self._sync_wrapper(fn, *args, **kwargs) [ 846.779741] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 846.779741] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] self.wait() [ 846.779741] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 846.779741] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] self[:] = self._gt.wait() [ 846.779741] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 846.779741] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] return self._exit_event.wait() [ 846.779741] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 846.780206] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] result = hub.switch() [ 846.780206] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 846.780206] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] return self.greenlet.switch() [ 846.780206] 
env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 846.780206] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] result = function(*args, **kwargs) [ 846.780206] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 846.780206] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] return func(*args, **kwargs) [ 846.780206] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 846.780206] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] raise e [ 846.780206] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 846.780206] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] nwinfo = self.network_api.allocate_for_instance( [ 846.780206] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 846.780206] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] created_port_ids = self._update_ports_for_instance( [ 846.780769] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 846.780769] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] with excutils.save_and_reraise_exception(): [ 846.780769] env[61439]: ERROR nova.compute.manager 
[instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 846.780769] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] self.force_reraise() [ 846.780769] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 846.780769] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] raise self.value [ 846.780769] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 846.780769] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] updated_port = self._update_port( [ 846.780769] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 846.780769] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] _ensure_no_port_binding_failure(port) [ 846.780769] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 846.780769] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] raise exception.PortBindingFailed(port_id=port['id']) [ 846.781900] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] nova.exception.PortBindingFailed: Binding failed for port 4f5def34-c667-41db-ab3e-33cee4e6b03a, please check neutron logs for more information. 
[ 846.781900] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] [ 846.781900] env[61439]: INFO nova.compute.manager [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Terminating instance [ 846.782069] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Acquiring lock "refresh_cache-a0be018c-59be-4eb0-a3aa-b7814ce240e8" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 846.782069] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Acquired lock "refresh_cache-a0be018c-59be-4eb0-a3aa-b7814ce240e8" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 846.782468] env[61439]: DEBUG nova.network.neutron [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 846.788639] env[61439]: DEBUG nova.network.neutron [-] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 846.798699] env[61439]: INFO nova.compute.manager [-] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Took 0.07 seconds to deallocate network for instance. 
[ 846.807684] env[61439]: DEBUG nova.compute.claims [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 846.807684] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 846.807684] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 846.824131] env[61439]: DEBUG nova.network.neutron [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 847.047846] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68746cf0-01a5-47ae-8964-b85401926d6b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 847.058517] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31eea88f-6703-4151-923a-50eb8aafa7e9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 847.094436] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bd9895d-44d6-4f0a-b1fa-81be78f02f7c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 847.102166] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da308a1d-e932-4bbd-b352-b7b2243a9f04 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 847.118092] env[61439]: DEBUG nova.compute.provider_tree [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 847.127364] env[61439]: DEBUG nova.scheduler.client.report [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 
196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 847.146304] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.340s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 847.146372] env[61439]: ERROR nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port fd9d5f30-56bf-48b3-ab8b-02cf36c2a603, please check neutron logs for more information. 
[ 847.146372] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Traceback (most recent call last): [ 847.146372] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 847.146372] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] self.driver.spawn(context, instance, image_meta, [ 847.146372] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 847.146372] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] self._vmops.spawn(context, instance, image_meta, injected_files, [ 847.146372] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 847.146372] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] vm_ref = self.build_virtual_machine(instance, [ 847.146372] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 847.146372] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] vif_infos = vmwarevif.get_vif_info(self._session, [ 847.146372] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 847.146735] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] for vif in network_info: [ 847.146735] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 847.146735] env[61439]: ERROR 
nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] return self._sync_wrapper(fn, *args, **kwargs) [ 847.146735] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 847.146735] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] self.wait() [ 847.146735] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 847.146735] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] self[:] = self._gt.wait() [ 847.146735] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 847.146735] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] return self._exit_event.wait() [ 847.146735] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 847.146735] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] result = hub.switch() [ 847.146735] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 847.146735] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] return self.greenlet.switch() [ 847.147122] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 847.147122] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] result 
= function(*args, **kwargs) [ 847.147122] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 847.147122] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] return func(*args, **kwargs) [ 847.147122] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 847.147122] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] raise e [ 847.147122] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 847.147122] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] nwinfo = self.network_api.allocate_for_instance( [ 847.147122] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 847.147122] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] created_port_ids = self._update_ports_for_instance( [ 847.147122] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 847.147122] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] with excutils.save_and_reraise_exception(): [ 847.147122] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 847.147564] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] self.force_reraise() [ 847.147564] env[61439]: 
ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 847.147564] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] raise self.value [ 847.147564] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 847.147564] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] updated_port = self._update_port( [ 847.147564] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 847.147564] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] _ensure_no_port_binding_failure(port) [ 847.147564] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 847.147564] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] raise exception.PortBindingFailed(port_id=port['id']) [ 847.147564] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] nova.exception.PortBindingFailed: Binding failed for port fd9d5f30-56bf-48b3-ab8b-02cf36c2a603, please check neutron logs for more information. [ 847.147564] env[61439]: ERROR nova.compute.manager [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] [ 847.147998] env[61439]: DEBUG nova.compute.utils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Binding failed for port fd9d5f30-56bf-48b3-ab8b-02cf36c2a603, please check neutron logs for more information. 
{{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 847.148732] env[61439]: DEBUG nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Build of instance 2d8f035e-8a57-4937-a394-10d94a5630ad was re-scheduled: Binding failed for port fd9d5f30-56bf-48b3-ab8b-02cf36c2a603, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 847.149162] env[61439]: DEBUG nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 847.149382] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Acquiring lock "refresh_cache-2d8f035e-8a57-4937-a394-10d94a5630ad" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 847.149525] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Acquired lock "refresh_cache-2d8f035e-8a57-4937-a394-10d94a5630ad" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 847.149680] env[61439]: DEBUG nova.network.neutron [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Building network info cache 
for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 847.223012] env[61439]: DEBUG nova.network.neutron [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 847.523532] env[61439]: DEBUG nova.network.neutron [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 847.537159] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Releasing lock "refresh_cache-a0be018c-59be-4eb0-a3aa-b7814ce240e8" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 847.537567] env[61439]: DEBUG nova.compute.manager [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 847.537763] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 847.538320] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-723d9b4e-ab7b-46d4-ab60-b1c1755e0064 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 847.550756] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15d5166a-86c3-4e2a-a671-70e04ae1d8c1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 847.576811] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a0be018c-59be-4eb0-a3aa-b7814ce240e8 could not be found. [ 847.576993] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 847.576993] env[61439]: INFO nova.compute.manager [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Took 0.04 seconds to destroy the instance on the hypervisor. 
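The destroy sequence above shows the driver treating a missing backend VM as a non-fatal condition: `FindAllByUuid` comes back empty, a WARNING with `InstanceNotFound` is logged, and the teardown proceeds as if the instance were already destroyed. A minimal sketch of that idempotent-delete pattern (illustrative names only, not the actual nova/vmwareapi source; the `dict`-backed `backend` stands in for the vCenter session):

```python
class InstanceNotFound(Exception):
    """Stand-in for nova.exception.InstanceNotFound."""


def lookup_vm(backend, instance_uuid):
    """Return the backend VM reference, or raise InstanceNotFound."""
    vm_ref = backend.get(instance_uuid)
    if vm_ref is None:
        raise InstanceNotFound(
            f"Instance {instance_uuid} could not be found.")
    return vm_ref


def destroy_instance(backend, instance_uuid):
    """Destroy the VM; a missing VM is not an error during teardown.

    Mirrors the log above: the not-found case is logged as a WARNING
    ("Instance does not exist on backend") and the delete continues.
    """
    try:
        lookup_vm(backend, instance_uuid)
    except InstanceNotFound:
        return "already-gone"
    backend.pop(instance_uuid)
    return "destroyed"
```

The design choice matters for cleanup paths: a delete that fails because the resource is already gone would wedge instances in a deleting state, so "not found" is folded into success.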
[ 847.577452] env[61439]: DEBUG oslo.service.loopingcall [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 847.577711] env[61439]: DEBUG nova.compute.manager [-] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 847.578136] env[61439]: DEBUG nova.network.neutron [-] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 847.647169] env[61439]: DEBUG nova.compute.manager [req-0cc8de21-7b15-4309-88b5-a3323ae793a7 req-b9d83098-5a39-42ec-93ae-4bd08bb93ec0 service nova] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Received event network-changed-08566279-b49f-411a-97b7-79b249e6fbac {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 847.647169] env[61439]: DEBUG nova.compute.manager [req-0cc8de21-7b15-4309-88b5-a3323ae793a7 req-b9d83098-5a39-42ec-93ae-4bd08bb93ec0 service nova] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Refreshing instance network info cache due to event network-changed-08566279-b49f-411a-97b7-79b249e6fbac. 
{{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 847.647169] env[61439]: DEBUG oslo_concurrency.lockutils [req-0cc8de21-7b15-4309-88b5-a3323ae793a7 req-b9d83098-5a39-42ec-93ae-4bd08bb93ec0 service nova] Acquiring lock "refresh_cache-95a16efa-3240-4e15-ae19-03aaef61e2de" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 847.647169] env[61439]: DEBUG oslo_concurrency.lockutils [req-0cc8de21-7b15-4309-88b5-a3323ae793a7 req-b9d83098-5a39-42ec-93ae-4bd08bb93ec0 service nova] Acquired lock "refresh_cache-95a16efa-3240-4e15-ae19-03aaef61e2de" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 847.647658] env[61439]: DEBUG nova.network.neutron [req-0cc8de21-7b15-4309-88b5-a3323ae793a7 req-b9d83098-5a39-42ec-93ae-4bd08bb93ec0 service nova] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Refreshing network info cache for port 08566279-b49f-411a-97b7-79b249e6fbac {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 847.665587] env[61439]: ERROR nova.compute.manager [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 08566279-b49f-411a-97b7-79b249e6fbac, please check neutron logs for more information. 
[ 847.665587] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 847.665587] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 847.665587] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 847.665587] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 847.665587] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 847.665587] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 847.665587] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 847.665587] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 847.665587] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 847.665587] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 847.665587] env[61439]: ERROR nova.compute.manager raise self.value [ 847.665587] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 847.665587] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 847.665587] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 847.665587] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 847.666113] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 847.666113] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 847.666113] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 08566279-b49f-411a-97b7-79b249e6fbac, please check neutron logs for more information. [ 847.666113] env[61439]: ERROR nova.compute.manager [ 847.668018] env[61439]: Traceback (most recent call last): [ 847.668018] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 847.668018] env[61439]: listener.cb(fileno) [ 847.668018] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 847.668018] env[61439]: result = function(*args, **kwargs) [ 847.668018] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 847.668018] env[61439]: return func(*args, **kwargs) [ 847.668018] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 847.668018] env[61439]: raise e [ 847.668018] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 847.668018] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 847.668018] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 847.668018] env[61439]: created_port_ids = self._update_ports_for_instance( [ 847.668018] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 847.668018] env[61439]: with excutils.save_and_reraise_exception(): [ 847.668018] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 847.668018] env[61439]: self.force_reraise() [ 847.668018] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 847.668018] env[61439]: raise self.value [ 847.668018] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 847.668018] env[61439]: 
updated_port = self._update_port( [ 847.668018] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 847.668018] env[61439]: _ensure_no_port_binding_failure(port) [ 847.668018] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 847.668018] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 847.668018] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 08566279-b49f-411a-97b7-79b249e6fbac, please check neutron logs for more information. [ 847.668018] env[61439]: Removing descriptor: 21 [ 847.668927] env[61439]: ERROR nova.compute.manager [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 08566279-b49f-411a-97b7-79b249e6fbac, please check neutron logs for more information. 
[ 847.668927] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Traceback (most recent call last): [ 847.668927] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 847.668927] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] yield resources [ 847.668927] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 847.668927] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] self.driver.spawn(context, instance, image_meta, [ 847.668927] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 847.668927] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] self._vmops.spawn(context, instance, image_meta, injected_files, [ 847.668927] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 847.668927] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] vm_ref = self.build_virtual_machine(instance, [ 847.668927] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 847.669304] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] vif_infos = vmwarevif.get_vif_info(self._session, [ 847.669304] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 847.669304] env[61439]: ERROR 
nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] for vif in network_info: [ 847.669304] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 847.669304] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] return self._sync_wrapper(fn, *args, **kwargs) [ 847.669304] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 847.669304] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] self.wait() [ 847.669304] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 847.669304] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] self[:] = self._gt.wait() [ 847.669304] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 847.669304] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] return self._exit_event.wait() [ 847.669304] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 847.669304] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] result = hub.switch() [ 847.669734] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 847.669734] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] return self.greenlet.switch() [ 847.669734] 
env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 847.669734] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] result = function(*args, **kwargs) [ 847.669734] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 847.669734] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] return func(*args, **kwargs) [ 847.669734] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 847.669734] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] raise e [ 847.669734] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 847.669734] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] nwinfo = self.network_api.allocate_for_instance( [ 847.669734] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 847.669734] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] created_port_ids = self._update_ports_for_instance( [ 847.669734] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 847.670146] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] with excutils.save_and_reraise_exception(): [ 847.670146] env[61439]: ERROR nova.compute.manager 
[instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 847.670146] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] self.force_reraise() [ 847.670146] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 847.670146] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] raise self.value [ 847.670146] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 847.670146] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] updated_port = self._update_port( [ 847.670146] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 847.670146] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] _ensure_no_port_binding_failure(port) [ 847.670146] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 847.670146] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] raise exception.PortBindingFailed(port_id=port['id']) [ 847.670146] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] nova.exception.PortBindingFailed: Binding failed for port 08566279-b49f-411a-97b7-79b249e6fbac, please check neutron logs for more information. 
[ 847.670146] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] [ 847.670592] env[61439]: INFO nova.compute.manager [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Terminating instance [ 847.674167] env[61439]: DEBUG nova.network.neutron [-] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 847.674413] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Acquiring lock "refresh_cache-95a16efa-3240-4e15-ae19-03aaef61e2de" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 847.685651] env[61439]: DEBUG nova.network.neutron [-] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 847.697335] env[61439]: INFO nova.compute.manager [-] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Took 0.12 seconds to deallocate network for instance. 
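Every traceback in this log bottoms out in the same frame: `_ensure_no_port_binding_failure(port)` at `nova/network/neutron.py:294` raising `PortBindingFailed`. After Neutron updates a port, Nova inspects the port's binding state and fails the build if Neutron reported that no mechanism driver could bind it. A minimal sketch of that check (illustrative, not the actual nova source; the real code compares against a VIF-type constant on the port's `binding:vif_type` attribute):

```python
VIF_TYPE_BINDING_FAILED = "binding_failed"


class PortBindingFailed(Exception):
    """Stand-in for nova.exception.PortBindingFailed."""

    def __init__(self, port_id):
        super().__init__(
            f"Binding failed for port {port_id}, please check "
            "neutron logs for more information.")
        self.port_id = port_id


def ensure_no_port_binding_failure(port):
    """Raise if Neutron reported a failed binding for this port.

    Neutron signals binding failure by setting binding:vif_type to a
    sentinel value rather than by returning an HTTP error, so Nova has
    to check the returned port body explicitly.
    """
    if port.get("binding:vif_type") == VIF_TYPE_BINDING_FAILED:
        raise PortBindingFailed(port_id=port["id"])
```

This explains why the log says "please check neutron logs": from Nova's side the port update succeeded, and the actual bind failure (missing agent, dead mechanism driver, host not mapped) is only diagnosable on the Neutron server.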
[ 847.700054] env[61439]: DEBUG nova.compute.claims [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 847.700054] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 847.700226] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 847.778877] env[61439]: DEBUG nova.network.neutron [req-0cc8de21-7b15-4309-88b5-a3323ae793a7 req-b9d83098-5a39-42ec-93ae-4bd08bb93ec0 service nova] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 847.849048] env[61439]: DEBUG nova.network.neutron [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 847.858776] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Releasing lock "refresh_cache-2d8f035e-8a57-4937-a394-10d94a5630ad" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 847.859039] env[61439]: DEBUG nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 847.859233] env[61439]: DEBUG nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 847.859397] env[61439]: DEBUG nova.network.neutron [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 847.881212] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9ccde80-5133-45a7-950d-2dab8fc8d918 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 847.889867] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f86d2be-0a3e-49c6-9251-05671278da34 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 847.934266] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef54776c-f51c-433e-9a56-8e8c3ef2641b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 847.940691] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d312c5b7-ed01-41cb-9598-9f542098385b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 847.954949] env[61439]: DEBUG nova.compute.provider_tree [None 
req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 847.956770] env[61439]: DEBUG nova.network.neutron [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 847.964009] env[61439]: DEBUG nova.scheduler.client.report [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 847.967611] env[61439]: DEBUG nova.network.neutron [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 847.976124] env[61439]: INFO nova.compute.manager [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 
tempest-MultipleCreateTestJSON-759979208-project-member] [instance: 2d8f035e-8a57-4937-a394-10d94a5630ad] Took 0.12 seconds to deallocate network for instance. [ 847.979792] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.279s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 847.980381] env[61439]: ERROR nova.compute.manager [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 4f5def34-c667-41db-ab3e-33cee4e6b03a, please check neutron logs for more information. 
[ 847.980381] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Traceback (most recent call last):
[ 847.980381] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 847.980381] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] self.driver.spawn(context, instance, image_meta,
[ 847.980381] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 847.980381] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 847.980381] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 847.980381] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] vm_ref = self.build_virtual_machine(instance,
[ 847.980381] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 847.980381] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] vif_infos = vmwarevif.get_vif_info(self._session,
[ 847.980381] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 847.980756] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] for vif in network_info:
[ 847.980756] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 847.980756] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] return self._sync_wrapper(fn, *args, **kwargs)
[ 847.980756] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 847.980756] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] self.wait()
[ 847.980756] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 847.980756] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] self[:] = self._gt.wait()
[ 847.980756] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 847.980756] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] return self._exit_event.wait()
[ 847.980756] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 847.980756] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] result = hub.switch()
[ 847.980756] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 847.980756] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] return self.greenlet.switch()
[ 847.981155] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 847.981155] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] result = function(*args, **kwargs)
[ 847.981155] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 847.981155] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] return func(*args, **kwargs)
[ 847.981155] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 847.981155] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] raise e
[ 847.981155] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 847.981155] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] nwinfo = self.network_api.allocate_for_instance(
[ 847.981155] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 847.981155] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] created_port_ids = self._update_ports_for_instance(
[ 847.981155] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 847.981155] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] with excutils.save_and_reraise_exception():
[ 847.981155] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 847.981521] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] self.force_reraise()
[ 847.981521] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 847.981521] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] raise self.value
[ 847.981521] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 847.981521] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] updated_port = self._update_port(
[ 847.981521] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 847.981521] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] _ensure_no_port_binding_failure(port)
[ 847.981521] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 847.981521] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] raise exception.PortBindingFailed(port_id=port['id'])
[ 847.981521] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] nova.exception.PortBindingFailed: Binding failed for port 4f5def34-c667-41db-ab3e-33cee4e6b03a, please check neutron logs for more information.
[ 847.981521] env[61439]: ERROR nova.compute.manager [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8]
[ 847.981838] env[61439]: DEBUG nova.compute.utils [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Binding failed for port 4f5def34-c667-41db-ab3e-33cee4e6b03a, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 847.982948] env[61439]: DEBUG nova.compute.manager [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Build of instance a0be018c-59be-4eb0-a3aa-b7814ce240e8 was re-scheduled: Binding failed for port 4f5def34-c667-41db-ab3e-33cee4e6b03a, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 847.983226] env[61439]: DEBUG nova.compute.manager [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 847.983497] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Acquiring lock "refresh_cache-a0be018c-59be-4eb0-a3aa-b7814ce240e8" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 847.983685] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Acquired lock "refresh_cache-a0be018c-59be-4eb0-a3aa-b7814ce240e8" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 847.983882] env[61439]: DEBUG nova.network.neutron [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 848.062377] env[61439]: DEBUG nova.network.neutron [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 848.106112] env[61439]: INFO nova.scheduler.client.report [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Deleted allocations for instance 2d8f035e-8a57-4937-a394-10d94a5630ad
[ 848.138751] env[61439]: DEBUG oslo_concurrency.lockutils [None req-69e21ef1-941c-4fe7-8bac-d48ac1fcda70 tempest-MultipleCreateTestJSON-759979208 tempest-MultipleCreateTestJSON-759979208-project-member] Lock "2d8f035e-8a57-4937-a394-10d94a5630ad" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.747s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 848.429868] env[61439]: DEBUG nova.network.neutron [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Successfully created port: c5a50d49-6f28-4037-a334-593256394f2c {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 848.566943] env[61439]: DEBUG nova.network.neutron [req-0cc8de21-7b15-4309-88b5-a3323ae793a7 req-b9d83098-5a39-42ec-93ae-4bd08bb93ec0 service nova] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 848.593641] env[61439]: DEBUG oslo_concurrency.lockutils [req-0cc8de21-7b15-4309-88b5-a3323ae793a7 req-b9d83098-5a39-42ec-93ae-4bd08bb93ec0 service nova] Releasing lock "refresh_cache-95a16efa-3240-4e15-ae19-03aaef61e2de" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 848.594205] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Acquired lock "refresh_cache-95a16efa-3240-4e15-ae19-03aaef61e2de" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 848.594525] env[61439]: DEBUG nova.network.neutron [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 848.659097] env[61439]: DEBUG nova.compute.manager [req-f583b8c2-44be-4aef-8b5f-560bd64c0821 req-bafb01d3-fd90-45fe-92be-3056f4f61969 service nova] [instance: 6288806c-d634-4749-8538-7188954788f0] Received event network-changed-da804bde-daba-479b-ab5f-0742b2a78cec {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 848.659309] env[61439]: DEBUG nova.compute.manager [req-f583b8c2-44be-4aef-8b5f-560bd64c0821 req-bafb01d3-fd90-45fe-92be-3056f4f61969 service nova] [instance: 6288806c-d634-4749-8538-7188954788f0] Refreshing instance network info cache due to event network-changed-da804bde-daba-479b-ab5f-0742b2a78cec. {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 848.663116] env[61439]: DEBUG oslo_concurrency.lockutils [req-f583b8c2-44be-4aef-8b5f-560bd64c0821 req-bafb01d3-fd90-45fe-92be-3056f4f61969 service nova] Acquiring lock "refresh_cache-6288806c-d634-4749-8538-7188954788f0" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 848.664816] env[61439]: DEBUG oslo_concurrency.lockutils [req-f583b8c2-44be-4aef-8b5f-560bd64c0821 req-bafb01d3-fd90-45fe-92be-3056f4f61969 service nova] Acquired lock "refresh_cache-6288806c-d634-4749-8538-7188954788f0" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 848.664816] env[61439]: DEBUG nova.network.neutron [req-f583b8c2-44be-4aef-8b5f-560bd64c0821 req-bafb01d3-fd90-45fe-92be-3056f4f61969 service nova] [instance: 6288806c-d634-4749-8538-7188954788f0] Refreshing network info cache for port da804bde-daba-479b-ab5f-0742b2a78cec {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 848.666328] env[61439]: DEBUG nova.network.neutron [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 848.672163] env[61439]: ERROR nova.compute.manager [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port da804bde-daba-479b-ab5f-0742b2a78cec, please check neutron logs for more information.
[ 848.672163] env[61439]: ERROR nova.compute.manager Traceback (most recent call last):
[ 848.672163] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 848.672163] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 848.672163] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 848.672163] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 848.672163] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 848.672163] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 848.672163] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 848.672163] env[61439]: ERROR nova.compute.manager self.force_reraise()
[ 848.672163] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 848.672163] env[61439]: ERROR nova.compute.manager raise self.value
[ 848.672163] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 848.672163] env[61439]: ERROR nova.compute.manager updated_port = self._update_port(
[ 848.672163] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 848.672163] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 848.672739] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 848.672739] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 848.672739] env[61439]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port da804bde-daba-479b-ab5f-0742b2a78cec, please check neutron logs for more information.
[ 848.672739] env[61439]: ERROR nova.compute.manager
[ 848.672739] env[61439]: Traceback (most recent call last):
[ 848.672739] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait
[ 848.672739] env[61439]: listener.cb(fileno)
[ 848.672739] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 848.672739] env[61439]: result = function(*args, **kwargs)
[ 848.672739] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 848.672739] env[61439]: return func(*args, **kwargs)
[ 848.672739] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 848.672739] env[61439]: raise e
[ 848.672739] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 848.672739] env[61439]: nwinfo = self.network_api.allocate_for_instance(
[ 848.672739] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 848.672739] env[61439]: created_port_ids = self._update_ports_for_instance(
[ 848.672739] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 848.672739] env[61439]: with excutils.save_and_reraise_exception():
[ 848.672739] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 848.672739] env[61439]: self.force_reraise()
[ 848.672739] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 848.672739] env[61439]: raise self.value
[ 848.672739] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 848.672739] env[61439]: updated_port = self._update_port(
[ 848.672739] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 848.672739] env[61439]: _ensure_no_port_binding_failure(port)
[ 848.672739] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 848.672739] env[61439]: raise exception.PortBindingFailed(port_id=port['id'])
[ 848.673662] env[61439]: nova.exception.PortBindingFailed: Binding failed for port da804bde-daba-479b-ab5f-0742b2a78cec, please check neutron logs for more information.
[ 848.673662] env[61439]: Removing descriptor: 18
[ 848.678025] env[61439]: ERROR nova.compute.manager [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port da804bde-daba-479b-ab5f-0742b2a78cec, please check neutron logs for more information.
[ 848.678025] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] Traceback (most recent call last):
[ 848.678025] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 848.678025] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] yield resources
[ 848.678025] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 848.678025] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] self.driver.spawn(context, instance, image_meta,
[ 848.678025] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 848.678025] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 848.678025] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 848.678025] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] vm_ref = self.build_virtual_machine(instance,
[ 848.678025] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 848.678547] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] vif_infos = vmwarevif.get_vif_info(self._session,
[ 848.678547] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 848.678547] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] for vif in network_info:
[ 848.678547] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 848.678547] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] return self._sync_wrapper(fn, *args, **kwargs)
[ 848.678547] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 848.678547] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] self.wait()
[ 848.678547] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 848.678547] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] self[:] = self._gt.wait()
[ 848.678547] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 848.678547] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] return self._exit_event.wait()
[ 848.678547] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 848.678547] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] result = hub.switch()
[ 848.678956] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 848.678956] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] return self.greenlet.switch()
[ 848.678956] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 848.678956] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] result = function(*args, **kwargs)
[ 848.678956] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 848.678956] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] return func(*args, **kwargs)
[ 848.678956] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 848.678956] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] raise e
[ 848.678956] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 848.678956] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] nwinfo = self.network_api.allocate_for_instance(
[ 848.678956] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 848.678956] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] created_port_ids = self._update_ports_for_instance(
[ 848.678956] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 848.679395] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] with excutils.save_and_reraise_exception():
[ 848.679395] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 848.679395] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] self.force_reraise()
[ 848.679395] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 848.679395] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] raise self.value
[ 848.679395] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 848.679395] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] updated_port = self._update_port(
[ 848.679395] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 848.679395] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] _ensure_no_port_binding_failure(port)
[ 848.679395] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 848.679395] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] raise exception.PortBindingFailed(port_id=port['id'])
[ 848.679395] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] nova.exception.PortBindingFailed: Binding failed for port da804bde-daba-479b-ab5f-0742b2a78cec, please check neutron logs for more information.
[ 848.679395] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]
[ 848.679888] env[61439]: INFO nova.compute.manager [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Terminating instance
[ 848.679888] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Acquiring lock "refresh_cache-6288806c-d634-4749-8538-7188954788f0" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 848.685754] env[61439]: DEBUG nova.network.neutron [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 848.705325] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Releasing lock "refresh_cache-a0be018c-59be-4eb0-a3aa-b7814ce240e8" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 848.705592] env[61439]: DEBUG nova.compute.manager [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 848.705778] env[61439]: DEBUG nova.compute.manager [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 848.705957] env[61439]: DEBUG nova.network.neutron [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 848.728834] env[61439]: DEBUG nova.network.neutron [req-f583b8c2-44be-4aef-8b5f-560bd64c0821 req-bafb01d3-fd90-45fe-92be-3056f4f61969 service nova] [instance: 6288806c-d634-4749-8538-7188954788f0] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 848.780067] env[61439]: DEBUG nova.network.neutron [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 848.792049] env[61439]: DEBUG nova.network.neutron [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 848.811946] env[61439]: INFO nova.compute.manager [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: a0be018c-59be-4eb0-a3aa-b7814ce240e8] Took 0.11 seconds to deallocate network for instance.
[ 849.004549] env[61439]: DEBUG nova.network.neutron [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 849.007048] env[61439]: INFO nova.scheduler.client.report [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Deleted allocations for instance a0be018c-59be-4eb0-a3aa-b7814ce240e8
[ 849.024321] env[61439]: DEBUG nova.network.neutron [req-f583b8c2-44be-4aef-8b5f-560bd64c0821 req-bafb01d3-fd90-45fe-92be-3056f4f61969 service nova] [instance: 6288806c-d634-4749-8538-7188954788f0] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 849.027863] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Releasing lock "refresh_cache-95a16efa-3240-4e15-ae19-03aaef61e2de" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 849.027863] env[61439]: DEBUG nova.compute.manager [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 849.027863] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 849.027863] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-1b4d90cb-8c8f-421b-a461-09399c7c3eec {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 849.035094] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bed1e0ff-88ef-497e-bbaf-2053a82c0ba6 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Lock "a0be018c-59be-4eb0-a3aa-b7814ce240e8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.015s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 849.044482] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a5e1134-05c7-4186-9e32-12852ca2a538 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 849.062324] env[61439]: DEBUG oslo_concurrency.lockutils [req-f583b8c2-44be-4aef-8b5f-560bd64c0821 req-bafb01d3-fd90-45fe-92be-3056f4f61969 service nova] Releasing lock "refresh_cache-6288806c-d634-4749-8538-7188954788f0" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 849.062324] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Acquired lock "refresh_cache-6288806c-d634-4749-8538-7188954788f0" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 849.062324] env[61439]: DEBUG nova.network.neutron [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 849.074205] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 95a16efa-3240-4e15-ae19-03aaef61e2de could not be found.
[ 849.074453] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 849.074653] env[61439]: INFO nova.compute.manager [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Took 0.05 seconds to destroy the instance on the hypervisor.
[ 849.074918] env[61439]: DEBUG oslo.service.loopingcall [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 849.075531] env[61439]: DEBUG nova.compute.manager [-] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 849.076191] env[61439]: DEBUG nova.network.neutron [-] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 849.123031] env[61439]: DEBUG nova.network.neutron [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 849.144017] env[61439]: DEBUG nova.network.neutron [-] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 849.154030] env[61439]: DEBUG nova.network.neutron [-] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 849.167498] env[61439]: INFO nova.compute.manager [-] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Took 0.09 seconds to deallocate network for instance.
[ 849.172837] env[61439]: DEBUG nova.compute.claims [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 849.172837] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 849.172837] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 849.335358] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d3ae3fe-08e0-4bd3-a06f-0aac6c9519e4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 849.346261] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-137a7867-e4d1-457c-9f99-dbb66d5005a0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 849.382635] env[61439]: DEBUG nova.network.neutron [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 849.383187] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-794b2a65-ecd7-4b30-83da-774e9311864f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 849.392290] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19469352-2602-4d66-9a5c-41e79f60f4a0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 849.396765] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Releasing lock "refresh_cache-6288806c-d634-4749-8538-7188954788f0" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 849.397180] env[61439]: DEBUG nova.compute.manager [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 849.397380] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 849.398119] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c97466c5-4606-4a1e-8e37-2cc57a9f71fd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 849.410992] env[61439]: DEBUG nova.compute.provider_tree [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 849.416264] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ff814b1-296e-48a4-924b-c4006708234c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 849.428357] env[61439]: DEBUG nova.scheduler.client.report [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 849.444055] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 6288806c-d634-4749-8538-7188954788f0 could not be found.
[ 849.444191] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 849.444299] env[61439]: INFO nova.compute.manager [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Took 0.05 seconds to destroy the instance on the hypervisor.
[ 849.444549] env[61439]: DEBUG oslo.service.loopingcall [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 849.445318] env[61439]: DEBUG nova.compute.manager [-] [instance: 6288806c-d634-4749-8538-7188954788f0] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 849.445430] env[61439]: DEBUG nova.network.neutron [-] [instance: 6288806c-d634-4749-8538-7188954788f0] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 849.447699] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.276s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 849.448181] env[61439]: ERROR nova.compute.manager [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 08566279-b49f-411a-97b7-79b249e6fbac, please check neutron logs for more information.
[ 849.448181] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Traceback (most recent call last):
[ 849.448181] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 849.448181] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]     self.driver.spawn(context, instance, image_meta,
[ 849.448181] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 849.448181] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 849.448181] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 849.448181] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]     vm_ref = self.build_virtual_machine(instance,
[ 849.448181] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 849.448181] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 849.448181] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 849.448713] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]     for vif in network_info:
[ 849.448713] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 849.448713] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]     return self._sync_wrapper(fn, *args, **kwargs)
[ 849.448713] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 849.448713] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]     self.wait()
[ 849.448713] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 849.448713] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]     self[:] = self._gt.wait()
[ 849.448713] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 849.448713] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]     return self._exit_event.wait()
[ 849.448713] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 849.448713] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]     result = hub.switch()
[ 849.448713] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 849.448713] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]     return self.greenlet.switch()
[ 849.449121] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 849.449121] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]     result = function(*args, **kwargs)
[ 849.449121] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]   File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 849.449121] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]     return func(*args, **kwargs)
[ 849.449121] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]   File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 849.449121] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]     raise e
[ 849.449121] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]   File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 849.449121] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]     nwinfo = self.network_api.allocate_for_instance(
[ 849.449121] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 849.449121] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]     created_port_ids = self._update_ports_for_instance(
[ 849.449121] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 849.449121] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]     with excutils.save_and_reraise_exception():
[ 849.449121] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 849.449683] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]     self.force_reraise()
[ 849.449683] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 849.449683] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]     raise self.value
[ 849.449683] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 849.449683] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]     updated_port = self._update_port(
[ 849.449683] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 849.449683] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]     _ensure_no_port_binding_failure(port)
[ 849.449683] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 849.449683] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]     raise exception.PortBindingFailed(port_id=port['id'])
[ 849.449683] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] nova.exception.PortBindingFailed: Binding failed for port 08566279-b49f-411a-97b7-79b249e6fbac, please check neutron logs for more information.
[ 849.449683] env[61439]: ERROR nova.compute.manager [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de]
[ 849.450049] env[61439]: DEBUG nova.compute.utils [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Binding failed for port 08566279-b49f-411a-97b7-79b249e6fbac, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 849.451169] env[61439]: DEBUG nova.compute.manager [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Build of instance 95a16efa-3240-4e15-ae19-03aaef61e2de was re-scheduled: Binding failed for port 08566279-b49f-411a-97b7-79b249e6fbac, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 849.451169] env[61439]: DEBUG nova.compute.manager [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 849.451169] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Acquiring lock "refresh_cache-95a16efa-3240-4e15-ae19-03aaef61e2de" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 849.451169] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Acquired lock "refresh_cache-95a16efa-3240-4e15-ae19-03aaef61e2de" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 849.451432] env[61439]: DEBUG nova.network.neutron [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 849.502761] env[61439]: DEBUG nova.network.neutron [-] [instance: 6288806c-d634-4749-8538-7188954788f0] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 849.504721] env[61439]: DEBUG nova.network.neutron [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 849.514700] env[61439]: DEBUG nova.network.neutron [-] [instance: 6288806c-d634-4749-8538-7188954788f0] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 849.527760] env[61439]: INFO nova.compute.manager [-] [instance: 6288806c-d634-4749-8538-7188954788f0] Took 0.08 seconds to deallocate network for instance.
[ 849.530140] env[61439]: DEBUG nova.compute.claims [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 849.530318] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 849.530537] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 849.683037] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4c9f0db-f6e7-4f7c-983d-ebac405e4820 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 849.692679] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-711ad9ec-d68e-49ec-baf3-5d95850293d5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 849.724764] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ab8c878-be90-479d-90bd-745a82b36f31 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 849.732700] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-825fdf0f-5c39-4ac6-bb39-ef5beac2d693 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 849.746298] env[61439]: DEBUG nova.compute.provider_tree [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 849.760291] env[61439]: DEBUG nova.scheduler.client.report [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 849.784871] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.254s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 849.785420] env[61439]: ERROR nova.compute.manager [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port da804bde-daba-479b-ab5f-0742b2a78cec, please check neutron logs for more information.
[ 849.785420] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] Traceback (most recent call last):
[ 849.785420] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 849.785420] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]     self.driver.spawn(context, instance, image_meta,
[ 849.785420] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 849.785420] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 849.785420] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 849.785420] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]     vm_ref = self.build_virtual_machine(instance,
[ 849.785420] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 849.785420] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 849.785420] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 849.785805] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]     for vif in network_info:
[ 849.785805] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 849.785805] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]     return self._sync_wrapper(fn, *args, **kwargs)
[ 849.785805] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 849.785805] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]     self.wait()
[ 849.785805] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 849.785805] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]     self[:] = self._gt.wait()
[ 849.785805] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 849.785805] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]     return self._exit_event.wait()
[ 849.785805] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 849.785805] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]     result = hub.switch()
[ 849.785805] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 849.785805] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]     return self.greenlet.switch()
[ 849.786246] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 849.786246] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]     result = function(*args, **kwargs)
[ 849.786246] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]   File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 849.786246] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]     return func(*args, **kwargs)
[ 849.786246] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]   File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 849.786246] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]     raise e
[ 849.786246] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]   File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 849.786246] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]     nwinfo = self.network_api.allocate_for_instance(
[ 849.786246] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 849.786246] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]     created_port_ids = self._update_ports_for_instance(
[ 849.786246] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 849.786246] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]     with excutils.save_and_reraise_exception():
[ 849.786246] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 849.786646] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]     self.force_reraise()
[ 849.786646] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 849.786646] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]     raise self.value
[ 849.786646] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 849.786646] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]     updated_port = self._update_port(
[ 849.786646] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 849.786646] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]     _ensure_no_port_binding_failure(port)
[ 849.786646] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 849.786646] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]     raise exception.PortBindingFailed(port_id=port['id'])
[ 849.786646] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0] nova.exception.PortBindingFailed: Binding failed for port da804bde-daba-479b-ab5f-0742b2a78cec, please check neutron logs for more information.
[ 849.786646] env[61439]: ERROR nova.compute.manager [instance: 6288806c-d634-4749-8538-7188954788f0]
[ 849.786986] env[61439]: DEBUG nova.compute.utils [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Binding failed for port da804bde-daba-479b-ab5f-0742b2a78cec, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 849.787768] env[61439]: DEBUG nova.compute.manager [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Build of instance 6288806c-d634-4749-8538-7188954788f0 was re-scheduled: Binding failed for port da804bde-daba-479b-ab5f-0742b2a78cec, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 849.788197] env[61439]: DEBUG nova.compute.manager [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 849.788436] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Acquiring lock "refresh_cache-6288806c-d634-4749-8538-7188954788f0" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 849.788602] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Acquired
lock "refresh_cache-6288806c-d634-4749-8538-7188954788f0" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 849.788768] env[61439]: DEBUG nova.network.neutron [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 849.883167] env[61439]: DEBUG nova.network.neutron [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 850.148015] env[61439]: DEBUG nova.compute.manager [req-ccc74b44-503d-4c6e-ad4f-cc7a1558adfb req-7ad55bdf-2f7e-4f3c-889c-c2fb9f140732 service nova] [instance: 53400571-766c-4020-b163-87a8816199cd] Received event network-changed-42f49d5f-4672-4e9e-9e10-a5ab004efa63 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 850.148504] env[61439]: DEBUG nova.compute.manager [req-ccc74b44-503d-4c6e-ad4f-cc7a1558adfb req-7ad55bdf-2f7e-4f3c-889c-c2fb9f140732 service nova] [instance: 53400571-766c-4020-b163-87a8816199cd] Refreshing instance network info cache due to event network-changed-42f49d5f-4672-4e9e-9e10-a5ab004efa63. 
{{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 850.149582] env[61439]: DEBUG oslo_concurrency.lockutils [req-ccc74b44-503d-4c6e-ad4f-cc7a1558adfb req-7ad55bdf-2f7e-4f3c-889c-c2fb9f140732 service nova] Acquiring lock "refresh_cache-53400571-766c-4020-b163-87a8816199cd" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 850.149582] env[61439]: DEBUG oslo_concurrency.lockutils [req-ccc74b44-503d-4c6e-ad4f-cc7a1558adfb req-7ad55bdf-2f7e-4f3c-889c-c2fb9f140732 service nova] Acquired lock "refresh_cache-53400571-766c-4020-b163-87a8816199cd" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 850.151402] env[61439]: DEBUG nova.network.neutron [req-ccc74b44-503d-4c6e-ad4f-cc7a1558adfb req-7ad55bdf-2f7e-4f3c-889c-c2fb9f140732 service nova] [instance: 53400571-766c-4020-b163-87a8816199cd] Refreshing network info cache for port 42f49d5f-4672-4e9e-9e10-a5ab004efa63 {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 850.193829] env[61439]: ERROR nova.compute.manager [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 42f49d5f-4672-4e9e-9e10-a5ab004efa63, please check neutron logs for more information. 
[ 850.193829] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 850.193829] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 850.193829] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 850.193829] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 850.193829] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 850.193829] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 850.193829] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 850.193829] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 850.193829] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 850.193829] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 850.193829] env[61439]: ERROR nova.compute.manager raise self.value [ 850.193829] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 850.193829] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 850.193829] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 850.193829] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 850.194350] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 850.194350] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 850.194350] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 42f49d5f-4672-4e9e-9e10-a5ab004efa63, please check neutron logs for more information. [ 850.194350] env[61439]: ERROR nova.compute.manager [ 850.194350] env[61439]: Traceback (most recent call last): [ 850.194350] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 850.194350] env[61439]: listener.cb(fileno) [ 850.194350] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 850.194350] env[61439]: result = function(*args, **kwargs) [ 850.194350] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 850.194350] env[61439]: return func(*args, **kwargs) [ 850.194350] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 850.194350] env[61439]: raise e [ 850.194350] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 850.194350] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 850.194350] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 850.194350] env[61439]: created_port_ids = self._update_ports_for_instance( [ 850.194350] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 850.194350] env[61439]: with excutils.save_and_reraise_exception(): [ 850.194350] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 850.194350] env[61439]: self.force_reraise() [ 850.194350] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 850.194350] env[61439]: raise self.value [ 850.194350] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 850.194350] env[61439]: 
updated_port = self._update_port( [ 850.194350] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 850.194350] env[61439]: _ensure_no_port_binding_failure(port) [ 850.194350] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 850.194350] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 850.195657] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 42f49d5f-4672-4e9e-9e10-a5ab004efa63, please check neutron logs for more information. [ 850.195657] env[61439]: Removing descriptor: 20 [ 850.195657] env[61439]: ERROR nova.compute.manager [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 42f49d5f-4672-4e9e-9e10-a5ab004efa63, please check neutron logs for more information. 
[ 850.195657] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] Traceback (most recent call last): [ 850.195657] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 850.195657] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] yield resources [ 850.195657] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 850.195657] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] self.driver.spawn(context, instance, image_meta, [ 850.195657] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 850.195657] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 850.195657] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 850.195657] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] vm_ref = self.build_virtual_machine(instance, [ 850.196038] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 850.196038] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] vif_infos = vmwarevif.get_vif_info(self._session, [ 850.196038] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 850.196038] env[61439]: ERROR 
nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] for vif in network_info: [ 850.196038] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 850.196038] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] return self._sync_wrapper(fn, *args, **kwargs) [ 850.196038] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 850.196038] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] self.wait() [ 850.196038] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 850.196038] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] self[:] = self._gt.wait() [ 850.196038] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 850.196038] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] return self._exit_event.wait() [ 850.196038] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 850.196488] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] result = hub.switch() [ 850.196488] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 850.196488] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] return self.greenlet.switch() [ 850.196488] 
env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 850.196488] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] result = function(*args, **kwargs) [ 850.196488] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 850.196488] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] return func(*args, **kwargs) [ 850.196488] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 850.196488] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] raise e [ 850.196488] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 850.196488] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] nwinfo = self.network_api.allocate_for_instance( [ 850.196488] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 850.196488] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] created_port_ids = self._update_ports_for_instance( [ 850.196899] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 850.196899] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] with excutils.save_and_reraise_exception(): [ 850.196899] env[61439]: ERROR nova.compute.manager 
[instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 850.196899] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] self.force_reraise() [ 850.196899] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 850.196899] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] raise self.value [ 850.196899] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 850.196899] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] updated_port = self._update_port( [ 850.196899] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 850.196899] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] _ensure_no_port_binding_failure(port) [ 850.196899] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 850.196899] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] raise exception.PortBindingFailed(port_id=port['id']) [ 850.197267] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] nova.exception.PortBindingFailed: Binding failed for port 42f49d5f-4672-4e9e-9e10-a5ab004efa63, please check neutron logs for more information. 
[ 850.197267] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] [ 850.197267] env[61439]: INFO nova.compute.manager [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Terminating instance [ 850.200822] env[61439]: DEBUG nova.network.neutron [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 850.201706] env[61439]: DEBUG oslo_concurrency.lockutils [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Acquiring lock "refresh_cache-53400571-766c-4020-b163-87a8816199cd" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 850.214617] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Releasing lock "refresh_cache-95a16efa-3240-4e15-ae19-03aaef61e2de" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 850.214867] env[61439]: DEBUG nova.compute.manager [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 850.215249] env[61439]: DEBUG nova.compute.manager [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 850.215249] env[61439]: DEBUG nova.network.neutron [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 850.301428] env[61439]: DEBUG nova.network.neutron [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 850.313171] env[61439]: DEBUG nova.network.neutron [req-ccc74b44-503d-4c6e-ad4f-cc7a1558adfb req-7ad55bdf-2f7e-4f3c-889c-c2fb9f140732 service nova] [instance: 53400571-766c-4020-b163-87a8816199cd] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 850.316182] env[61439]: DEBUG nova.network.neutron [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 850.330625] env[61439]: INFO nova.compute.manager [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] [instance: 95a16efa-3240-4e15-ae19-03aaef61e2de] Took 0.12 seconds to deallocate network for instance. [ 850.355548] env[61439]: DEBUG nova.network.neutron [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 850.368250] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Releasing lock "refresh_cache-6288806c-d634-4749-8538-7188954788f0" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 850.368600] env[61439]: DEBUG nova.compute.manager [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 850.368742] env[61439]: DEBUG nova.compute.manager [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 850.368984] env[61439]: DEBUG nova.network.neutron [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 850.441136] env[61439]: DEBUG nova.network.neutron [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 850.441927] env[61439]: INFO nova.scheduler.client.report [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Deleted allocations for instance 95a16efa-3240-4e15-ae19-03aaef61e2de [ 850.452596] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Acquiring lock "c9574f78-b316-467b-acbe-2122886c2990" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 850.452902] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Lock "c9574f78-b316-467b-acbe-2122886c2990" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 850.453973] env[61439]: DEBUG nova.network.neutron [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 850.464220] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d66a3643-738f-4c2d-a23a-245c2483597c tempest-ImagesTestJSON-1787928631 tempest-ImagesTestJSON-1787928631-project-member] Lock "95a16efa-3240-4e15-ae19-03aaef61e2de" "released" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.553s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 850.468943] env[61439]: DEBUG nova.compute.manager [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 850.475459] env[61439]: INFO nova.compute.manager [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] [instance: 6288806c-d634-4749-8538-7188954788f0] Took 0.11 seconds to deallocate network for instance. [ 850.542120] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 850.542475] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 850.546916] env[61439]: INFO nova.compute.claims [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] Claim successful on 
node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 850.611225] env[61439]: INFO nova.scheduler.client.report [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Deleted allocations for instance 6288806c-d634-4749-8538-7188954788f0 [ 850.710459] env[61439]: DEBUG oslo_concurrency.lockutils [None req-bcac3c5c-f5ed-46d1-9cae-1cb4ad1bc677 tempest-ServersAdminTestJSON-620780778 tempest-ServersAdminTestJSON-620780778-project-member] Lock "6288806c-d634-4749-8538-7188954788f0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.533s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 850.738026] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ac6f96c-56e8-4e8c-843d-80070513b7c6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 850.745252] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eeef8085-e088-4e52-82ab-fd0ff3b6f7b3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 850.776624] env[61439]: DEBUG nova.network.neutron [req-ccc74b44-503d-4c6e-ad4f-cc7a1558adfb req-7ad55bdf-2f7e-4f3c-889c-c2fb9f140732 service nova] [instance: 53400571-766c-4020-b163-87a8816199cd] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 850.778589] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-609215ab-6990-4f21-a993-5144f6599757 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 850.787367] env[61439]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0f4a57e-6f86-46ee-ab25-3e39ae3027cc {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 850.791745] env[61439]: DEBUG oslo_concurrency.lockutils [req-ccc74b44-503d-4c6e-ad4f-cc7a1558adfb req-7ad55bdf-2f7e-4f3c-889c-c2fb9f140732 service nova] Releasing lock "refresh_cache-53400571-766c-4020-b163-87a8816199cd" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 850.792422] env[61439]: DEBUG oslo_concurrency.lockutils [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Acquired lock "refresh_cache-53400571-766c-4020-b163-87a8816199cd" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 850.792616] env[61439]: DEBUG nova.network.neutron [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 850.806703] env[61439]: DEBUG nova.compute.provider_tree [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 850.815502] env[61439]: DEBUG nova.scheduler.client.report [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: 
{'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 850.835082] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.292s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 850.835287] env[61439]: DEBUG nova.compute.manager [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 851.612665] env[61439]: DEBUG nova.network.neutron [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 851.615844] env[61439]: DEBUG nova.compute.utils [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 851.617696] env[61439]: DEBUG nova.compute.manager [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 851.617696] env[61439]: DEBUG nova.network.neutron [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 851.634866] env[61439]: DEBUG nova.compute.manager [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 851.729641] env[61439]: DEBUG nova.compute.manager [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 851.761779] env[61439]: DEBUG nova.virt.hardware [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 851.762053] env[61439]: DEBUG nova.virt.hardware [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 851.762215] env[61439]: DEBUG nova.virt.hardware [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 851.762417] env[61439]: DEBUG nova.virt.hardware [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Flavor 
pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 851.762570] env[61439]: DEBUG nova.virt.hardware [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 851.762715] env[61439]: DEBUG nova.virt.hardware [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 851.762923] env[61439]: DEBUG nova.virt.hardware [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 851.763090] env[61439]: DEBUG nova.virt.hardware [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 851.763257] env[61439]: DEBUG nova.virt.hardware [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 851.763421] env[61439]: DEBUG nova.virt.hardware [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 
tempest-AttachVolumeNegativeTest-1225387431-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 851.763590] env[61439]: DEBUG nova.virt.hardware [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 851.764501] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01f2bdac-cbb0-4c01-916c-74f12b40c552 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 851.772920] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30de2c33-329a-4b67-9350-0063e48cc65b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 851.866198] env[61439]: DEBUG nova.policy [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '721ff5c63c5f405bb7be8c486b5fd162', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd841db9575854aa388acc0bbb499fd52', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 852.316360] env[61439]: DEBUG nova.network.neutron [None req-19035daf-7f73-468d-8208-a485a3caba8f 
tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 852.325720] env[61439]: DEBUG oslo_concurrency.lockutils [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Releasing lock "refresh_cache-53400571-766c-4020-b163-87a8816199cd" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 852.326139] env[61439]: DEBUG nova.compute.manager [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 852.326334] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 852.326858] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f2414467-f7ae-4925-bbd6-8dc9b4d31a2f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 852.339020] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf79cffd-d0ef-469b-a077-3bb9cf83da02 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 852.358775] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-19035daf-7f73-468d-8208-a485a3caba8f 
tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 53400571-766c-4020-b163-87a8816199cd could not be found. [ 852.359009] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 852.359198] env[61439]: INFO nova.compute.manager [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Took 0.03 seconds to destroy the instance on the hypervisor. [ 852.359552] env[61439]: DEBUG oslo.service.loopingcall [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 852.359635] env[61439]: DEBUG nova.compute.manager [-] [instance: 53400571-766c-4020-b163-87a8816199cd] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 852.359732] env[61439]: DEBUG nova.network.neutron [-] [instance: 53400571-766c-4020-b163-87a8816199cd] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 852.437046] env[61439]: DEBUG nova.compute.manager [req-22b81385-2208-40da-a4ca-f0b8c76849d8 req-03e6fe3c-d606-42d8-8974-a2def93fcab0 service nova] [instance: 53400571-766c-4020-b163-87a8816199cd] Received event network-vif-deleted-42f49d5f-4672-4e9e-9e10-a5ab004efa63 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 852.672970] env[61439]: DEBUG nova.network.neutron [-] [instance: 53400571-766c-4020-b163-87a8816199cd] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 853.845399] env[61439]: DEBUG nova.network.neutron [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] Successfully created port: 03976688-9968-4c5b-b4cf-f1d6882dd680 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 854.318940] env[61439]: ERROR nova.compute.manager [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port c5a50d49-6f28-4037-a334-593256394f2c, please check neutron logs for more information. 
[ 854.318940] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 854.318940] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 854.318940] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 854.318940] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 854.318940] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 854.318940] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 854.318940] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 854.318940] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 854.318940] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 854.318940] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 854.318940] env[61439]: ERROR nova.compute.manager raise self.value [ 854.318940] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 854.318940] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 854.318940] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 854.318940] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 854.319443] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 854.319443] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 854.319443] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port c5a50d49-6f28-4037-a334-593256394f2c, please check neutron logs for more information. [ 854.319443] env[61439]: ERROR nova.compute.manager [ 854.319443] env[61439]: Traceback (most recent call last): [ 854.319443] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 854.319443] env[61439]: listener.cb(fileno) [ 854.319443] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 854.319443] env[61439]: result = function(*args, **kwargs) [ 854.319443] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 854.319443] env[61439]: return func(*args, **kwargs) [ 854.319443] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 854.319443] env[61439]: raise e [ 854.319443] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 854.319443] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 854.319443] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 854.319443] env[61439]: created_port_ids = self._update_ports_for_instance( [ 854.319443] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 854.319443] env[61439]: with excutils.save_and_reraise_exception(): [ 854.319443] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 854.319443] env[61439]: self.force_reraise() [ 854.319443] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 854.319443] env[61439]: raise self.value [ 854.319443] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 854.319443] env[61439]: 
updated_port = self._update_port( [ 854.319443] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 854.319443] env[61439]: _ensure_no_port_binding_failure(port) [ 854.319443] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 854.319443] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 854.320406] env[61439]: nova.exception.PortBindingFailed: Binding failed for port c5a50d49-6f28-4037-a334-593256394f2c, please check neutron logs for more information. [ 854.320406] env[61439]: Removing descriptor: 26 [ 854.320406] env[61439]: ERROR nova.compute.manager [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port c5a50d49-6f28-4037-a334-593256394f2c, please check neutron logs for more information. 
[ 854.320406] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Traceback (most recent call last): [ 854.320406] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 854.320406] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] yield resources [ 854.320406] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 854.320406] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] self.driver.spawn(context, instance, image_meta, [ 854.320406] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 854.320406] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 854.320406] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 854.320406] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] vm_ref = self.build_virtual_machine(instance, [ 854.320770] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 854.320770] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] vif_infos = vmwarevif.get_vif_info(self._session, [ 854.320770] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 854.320770] env[61439]: ERROR 
nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] for vif in network_info: [ 854.320770] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 854.320770] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] return self._sync_wrapper(fn, *args, **kwargs) [ 854.320770] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 854.320770] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] self.wait() [ 854.320770] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 854.320770] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] self[:] = self._gt.wait() [ 854.320770] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 854.320770] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] return self._exit_event.wait() [ 854.320770] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 854.321162] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] result = hub.switch() [ 854.321162] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 854.321162] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] return self.greenlet.switch() [ 854.321162] 
env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 854.321162] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] result = function(*args, **kwargs) [ 854.321162] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 854.321162] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] return func(*args, **kwargs) [ 854.321162] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 854.321162] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] raise e [ 854.321162] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 854.321162] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] nwinfo = self.network_api.allocate_for_instance( [ 854.321162] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 854.321162] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] created_port_ids = self._update_ports_for_instance( [ 854.321534] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 854.321534] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] with excutils.save_and_reraise_exception(): [ 854.321534] env[61439]: ERROR nova.compute.manager 
[instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 854.321534] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] self.force_reraise() [ 854.321534] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 854.321534] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] raise self.value [ 854.321534] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 854.321534] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] updated_port = self._update_port( [ 854.321534] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 854.321534] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] _ensure_no_port_binding_failure(port) [ 854.321534] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 854.321534] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] raise exception.PortBindingFailed(port_id=port['id']) [ 854.321891] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] nova.exception.PortBindingFailed: Binding failed for port c5a50d49-6f28-4037-a334-593256394f2c, please check neutron logs for more information. 
[ 854.321891] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] [ 854.321891] env[61439]: INFO nova.compute.manager [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Terminating instance [ 854.328284] env[61439]: DEBUG oslo_concurrency.lockutils [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "refresh_cache-ce8c0523-8a11-4da8-af00-ee6b246ffac4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 854.328284] env[61439]: DEBUG oslo_concurrency.lockutils [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquired lock "refresh_cache-ce8c0523-8a11-4da8-af00-ee6b246ffac4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 854.328488] env[61439]: DEBUG nova.network.neutron [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 854.405560] env[61439]: DEBUG nova.network.neutron [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 854.927233] env[61439]: DEBUG nova.network.neutron [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 854.945982] env[61439]: DEBUG oslo_concurrency.lockutils [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Releasing lock "refresh_cache-ce8c0523-8a11-4da8-af00-ee6b246ffac4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 854.946292] env[61439]: DEBUG nova.compute.manager [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 854.946464] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 854.947068] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4ce36962-63f3-46be-adc4-642b9e1d61af {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 854.963566] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b62db6cf-0e15-4d24-8bf0-10be2d501bef {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 854.997178] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ce8c0523-8a11-4da8-af00-ee6b246ffac4 could not be found. [ 854.997981] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 854.998318] env[61439]: INFO nova.compute.manager [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Took 0.05 seconds to destroy the instance on the hypervisor. 
[ 854.998604] env[61439]: DEBUG oslo.service.loopingcall [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 855.001026] env[61439]: DEBUG nova.compute.manager [-] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 855.001026] env[61439]: DEBUG nova.network.neutron [-] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 855.004574] env[61439]: DEBUG nova.compute.manager [req-72cae087-c02c-4dcd-b3c4-26eb34ef5ead req-1858ed19-976c-42d4-b17a-201655a308bb service nova] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Received event network-changed-c5a50d49-6f28-4037-a334-593256394f2c {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 855.005120] env[61439]: DEBUG nova.compute.manager [req-72cae087-c02c-4dcd-b3c4-26eb34ef5ead req-1858ed19-976c-42d4-b17a-201655a308bb service nova] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Refreshing instance network info cache due to event network-changed-c5a50d49-6f28-4037-a334-593256394f2c. 
{{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 855.005120] env[61439]: DEBUG oslo_concurrency.lockutils [req-72cae087-c02c-4dcd-b3c4-26eb34ef5ead req-1858ed19-976c-42d4-b17a-201655a308bb service nova] Acquiring lock "refresh_cache-ce8c0523-8a11-4da8-af00-ee6b246ffac4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 855.005120] env[61439]: DEBUG oslo_concurrency.lockutils [req-72cae087-c02c-4dcd-b3c4-26eb34ef5ead req-1858ed19-976c-42d4-b17a-201655a308bb service nova] Acquired lock "refresh_cache-ce8c0523-8a11-4da8-af00-ee6b246ffac4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 855.005395] env[61439]: DEBUG nova.network.neutron [req-72cae087-c02c-4dcd-b3c4-26eb34ef5ead req-1858ed19-976c-42d4-b17a-201655a308bb service nova] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Refreshing network info cache for port c5a50d49-6f28-4037-a334-593256394f2c {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 855.013666] env[61439]: DEBUG nova.network.neutron [-] [instance: 53400571-766c-4020-b163-87a8816199cd] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 855.028406] env[61439]: INFO nova.compute.manager [-] [instance: 53400571-766c-4020-b163-87a8816199cd] Took 2.67 seconds to deallocate network for instance. 
[ 855.031011] env[61439]: DEBUG nova.compute.claims [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 855.031011] env[61439]: DEBUG oslo_concurrency.lockutils [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 855.031011] env[61439]: DEBUG oslo_concurrency.lockutils [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 855.055690] env[61439]: DEBUG nova.network.neutron [-] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 855.066844] env[61439]: DEBUG nova.network.neutron [-] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 855.090051] env[61439]: INFO nova.compute.manager [-] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Took 0.09 seconds to deallocate network for instance. 
[ 855.092121] env[61439]: DEBUG nova.compute.claims [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 855.094189] env[61439]: DEBUG oslo_concurrency.lockutils [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 855.117079] env[61439]: DEBUG nova.network.neutron [req-72cae087-c02c-4dcd-b3c4-26eb34ef5ead req-1858ed19-976c-42d4-b17a-201655a308bb service nova] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 855.232682] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e54e763-7cf3-44fe-8cb9-2f2b2a1dd168 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 855.247316] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b69a3c2-3ea4-49b0-a773-95d632ed9b55 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 855.286320] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b84783e0-6104-4965-b8e5-54d9da30b337 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 855.294611] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-51a4b270-4d54-4ecd-96e1-733a38df65c5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 855.314233] env[61439]: DEBUG nova.compute.provider_tree [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 855.329860] env[61439]: DEBUG nova.scheduler.client.report [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 855.348401] env[61439]: DEBUG oslo_concurrency.lockutils [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.317s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 855.349216] env[61439]: ERROR nova.compute.manager [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Failed to build and run instance: 
nova.exception.PortBindingFailed: Binding failed for port 42f49d5f-4672-4e9e-9e10-a5ab004efa63, please check neutron logs for more information. [ 855.349216] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] Traceback (most recent call last): [ 855.349216] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 855.349216] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] self.driver.spawn(context, instance, image_meta, [ 855.349216] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 855.349216] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 855.349216] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 855.349216] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] vm_ref = self.build_virtual_machine(instance, [ 855.349216] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 855.349216] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] vif_infos = vmwarevif.get_vif_info(self._session, [ 855.349216] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 855.349761] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] for vif in network_info: [ 855.349761] env[61439]: ERROR nova.compute.manager [instance: 
53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 855.349761] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] return self._sync_wrapper(fn, *args, **kwargs) [ 855.349761] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 855.349761] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] self.wait() [ 855.349761] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 855.349761] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] self[:] = self._gt.wait() [ 855.349761] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 855.349761] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] return self._exit_event.wait() [ 855.349761] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 855.349761] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] result = hub.switch() [ 855.349761] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 855.349761] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] return self.greenlet.switch() [ 855.350249] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 855.350249] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] result = function(*args, **kwargs) [ 855.350249] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 855.350249] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] return func(*args, **kwargs) [ 855.350249] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 855.350249] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] raise e [ 855.350249] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 855.350249] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] nwinfo = self.network_api.allocate_for_instance( [ 855.350249] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 855.350249] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] created_port_ids = self._update_ports_for_instance( [ 855.350249] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 855.350249] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] with excutils.save_and_reraise_exception(): [ 855.350249] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 855.350728] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] self.force_reraise() [ 855.350728] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 855.350728] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] raise self.value [ 855.350728] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 855.350728] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] updated_port = self._update_port( [ 855.350728] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 855.350728] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] _ensure_no_port_binding_failure(port) [ 855.350728] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 855.350728] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] raise exception.PortBindingFailed(port_id=port['id']) [ 855.350728] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] nova.exception.PortBindingFailed: Binding failed for port 42f49d5f-4672-4e9e-9e10-a5ab004efa63, please check neutron logs for more information. 
[ 855.350728] env[61439]: ERROR nova.compute.manager [instance: 53400571-766c-4020-b163-87a8816199cd] [ 855.353745] env[61439]: DEBUG nova.compute.utils [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Binding failed for port 42f49d5f-4672-4e9e-9e10-a5ab004efa63, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 855.354517] env[61439]: DEBUG oslo_concurrency.lockutils [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.262s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 855.359791] env[61439]: DEBUG nova.compute.manager [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Build of instance 53400571-766c-4020-b163-87a8816199cd was re-scheduled: Binding failed for port 42f49d5f-4672-4e9e-9e10-a5ab004efa63, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 855.359791] env[61439]: DEBUG nova.compute.manager [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 855.359791] env[61439]: DEBUG oslo_concurrency.lockutils [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Acquiring lock "refresh_cache-53400571-766c-4020-b163-87a8816199cd" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 855.359791] env[61439]: DEBUG oslo_concurrency.lockutils [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Acquired lock "refresh_cache-53400571-766c-4020-b163-87a8816199cd" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 855.360060] env[61439]: DEBUG nova.network.neutron [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 855.411162] env[61439]: DEBUG nova.network.neutron [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 855.549179] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2044bd7a-37af-404d-a78c-2b13ec76b8be {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 855.560061] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f57ebfda-cdca-453d-879b-46da803af0ef {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 855.604163] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cccbe208-2214-4e71-b9b6-a380bd73da02 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 855.613043] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6de59abb-735e-4ab5-8f86-43111ce87ada {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 855.631546] env[61439]: DEBUG nova.compute.provider_tree [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 855.643025] env[61439]: DEBUG nova.scheduler.client.report [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 
'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 855.657562] env[61439]: DEBUG oslo_concurrency.lockutils [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.303s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 855.658255] env[61439]: ERROR nova.compute.manager [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port c5a50d49-6f28-4037-a334-593256394f2c, please check neutron logs for more information. 
[ 855.658255] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Traceback (most recent call last): [ 855.658255] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 855.658255] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] self.driver.spawn(context, instance, image_meta, [ 855.658255] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 855.658255] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 855.658255] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 855.658255] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] vm_ref = self.build_virtual_machine(instance, [ 855.658255] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 855.658255] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] vif_infos = vmwarevif.get_vif_info(self._session, [ 855.658255] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 855.658782] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] for vif in network_info: [ 855.658782] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 855.658782] env[61439]: ERROR 
nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] return self._sync_wrapper(fn, *args, **kwargs) [ 855.658782] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 855.658782] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] self.wait() [ 855.658782] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 855.658782] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] self[:] = self._gt.wait() [ 855.658782] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 855.658782] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] return self._exit_event.wait() [ 855.658782] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 855.658782] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] result = hub.switch() [ 855.658782] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 855.658782] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] return self.greenlet.switch() [ 855.659364] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 855.659364] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] result 
= function(*args, **kwargs) [ 855.659364] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 855.659364] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] return func(*args, **kwargs) [ 855.659364] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 855.659364] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] raise e [ 855.659364] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 855.659364] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] nwinfo = self.network_api.allocate_for_instance( [ 855.659364] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 855.659364] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] created_port_ids = self._update_ports_for_instance( [ 855.659364] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 855.659364] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] with excutils.save_and_reraise_exception(): [ 855.659364] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 855.660357] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] self.force_reraise() [ 855.660357] env[61439]: 
ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 855.660357] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] raise self.value [ 855.660357] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 855.660357] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] updated_port = self._update_port( [ 855.660357] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 855.660357] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] _ensure_no_port_binding_failure(port) [ 855.660357] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 855.660357] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] raise exception.PortBindingFailed(port_id=port['id']) [ 855.660357] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] nova.exception.PortBindingFailed: Binding failed for port c5a50d49-6f28-4037-a334-593256394f2c, please check neutron logs for more information. [ 855.660357] env[61439]: ERROR nova.compute.manager [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] [ 855.660861] env[61439]: DEBUG nova.compute.utils [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Binding failed for port c5a50d49-6f28-4037-a334-593256394f2c, please check neutron logs for more information. 
{{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 855.660861] env[61439]: DEBUG nova.compute.manager [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Build of instance ce8c0523-8a11-4da8-af00-ee6b246ffac4 was re-scheduled: Binding failed for port c5a50d49-6f28-4037-a334-593256394f2c, please check neutron logs for more information. {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 855.661000] env[61439]: DEBUG nova.compute.manager [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 855.661220] env[61439]: DEBUG oslo_concurrency.lockutils [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "refresh_cache-ce8c0523-8a11-4da8-af00-ee6b246ffac4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 855.749095] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Acquiring lock "d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 855.749347] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Lock 
"d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 855.763616] env[61439]: DEBUG nova.compute.manager [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 855.769219] env[61439]: DEBUG nova.network.neutron [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 855.783196] env[61439]: DEBUG oslo_concurrency.lockutils [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Releasing lock "refresh_cache-53400571-766c-4020-b163-87a8816199cd" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 855.783444] env[61439]: DEBUG nova.compute.manager [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 855.783793] env[61439]: DEBUG nova.compute.manager [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 855.783894] env[61439]: DEBUG nova.network.neutron [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 855.829112] env[61439]: DEBUG nova.network.neutron [req-72cae087-c02c-4dcd-b3c4-26eb34ef5ead req-1858ed19-976c-42d4-b17a-201655a308bb service nova] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 855.840666] env[61439]: DEBUG oslo_concurrency.lockutils [req-72cae087-c02c-4dcd-b3c4-26eb34ef5ead req-1858ed19-976c-42d4-b17a-201655a308bb service nova] Releasing lock "refresh_cache-ce8c0523-8a11-4da8-af00-ee6b246ffac4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 855.840666] env[61439]: DEBUG nova.compute.manager [req-72cae087-c02c-4dcd-b3c4-26eb34ef5ead req-1858ed19-976c-42d4-b17a-201655a308bb service nova] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Received event network-vif-deleted-c5a50d49-6f28-4037-a334-593256394f2c {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 855.841323] env[61439]: DEBUG nova.network.neutron [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 
tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 855.843652] env[61439]: DEBUG oslo_concurrency.lockutils [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquired lock "refresh_cache-ce8c0523-8a11-4da8-af00-ee6b246ffac4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 855.843894] env[61439]: DEBUG nova.network.neutron [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 855.848884] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 855.849134] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 855.850614] env[61439]: INFO nova.compute.claims [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: 
d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 855.854498] env[61439]: DEBUG nova.network.neutron [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 855.862896] env[61439]: INFO nova.compute.manager [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: 53400571-766c-4020-b163-87a8816199cd] Took 0.08 seconds to deallocate network for instance. [ 855.932293] env[61439]: DEBUG nova.network.neutron [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 855.979654] env[61439]: INFO nova.scheduler.client.report [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Deleted allocations for instance 53400571-766c-4020-b163-87a8816199cd [ 856.007131] env[61439]: DEBUG oslo_concurrency.lockutils [None req-19035daf-7f73-468d-8208-a485a3caba8f tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Lock "53400571-766c-4020-b163-87a8816199cd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 22.676s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 856.040797] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0da93e26-37dd-484d-a94b-6b326d7e268d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 856.050114] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d774cd1-cc7e-4203-b16c-580becb260ce {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 856.082221] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94f38af6-2623-4fcd-bb96-1b353dafea8e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 856.090494] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7e550e5-0e79-49df-b6d3-75c91433c2d0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 856.104705] env[61439]: DEBUG nova.compute.provider_tree [None 
req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 856.114609] env[61439]: DEBUG nova.scheduler.client.report [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 856.130446] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.281s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 856.130958] env[61439]: DEBUG nova.compute.manager [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Start building networks asynchronously for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 856.175597] env[61439]: DEBUG nova.compute.utils [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 856.176889] env[61439]: DEBUG nova.compute.manager [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 856.177063] env[61439]: DEBUG nova.network.neutron [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 856.191223] env[61439]: DEBUG nova.compute.manager [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 856.268039] env[61439]: DEBUG nova.compute.manager [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 856.308163] env[61439]: DEBUG nova.virt.hardware [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 856.308163] env[61439]: DEBUG nova.virt.hardware [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 856.308163] env[61439]: DEBUG nova.virt.hardware [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 856.308431] env[61439]: DEBUG nova.virt.hardware [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:388}} [ 856.308431] env[61439]: DEBUG nova.virt.hardware [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 856.308431] env[61439]: DEBUG nova.virt.hardware [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 856.308777] env[61439]: DEBUG nova.virt.hardware [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 856.309085] env[61439]: DEBUG nova.virt.hardware [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 856.309573] env[61439]: DEBUG nova.virt.hardware [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 856.309973] env[61439]: DEBUG nova.virt.hardware [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) 
_get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 856.310404] env[61439]: DEBUG nova.virt.hardware [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 856.311387] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96fdb87e-bf35-49d4-ae35-7e557a338eb2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 856.322698] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20d38cf7-e4b5-4968-8ad4-5af59b08eeff {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 856.556283] env[61439]: DEBUG nova.policy [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5bf57aa5f880432aac740791e76d30d8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ec2e5213b768477f82c160e45d6ebe6b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 856.557113] env[61439]: DEBUG nova.network.neutron [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Updating instance_info_cache with network_info: [] 
{{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 856.570739] env[61439]: DEBUG oslo_concurrency.lockutils [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Releasing lock "refresh_cache-ce8c0523-8a11-4da8-af00-ee6b246ffac4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 856.570964] env[61439]: DEBUG nova.compute.manager [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 856.571161] env[61439]: DEBUG nova.compute.manager [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 856.571324] env[61439]: DEBUG nova.network.neutron [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 856.648158] env[61439]: DEBUG nova.network.neutron [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 856.664811] env[61439]: DEBUG nova.network.neutron [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 856.679816] env[61439]: INFO nova.compute.manager [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: ce8c0523-8a11-4da8-af00-ee6b246ffac4] Took 0.11 seconds to deallocate network for instance. [ 856.811523] env[61439]: INFO nova.scheduler.client.report [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Deleted allocations for instance ce8c0523-8a11-4da8-af00-ee6b246ffac4 [ 856.836346] env[61439]: DEBUG oslo_concurrency.lockutils [None req-51a546be-8e81-49cd-94d2-ac10bfa0e01a tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "ce8c0523-8a11-4da8-af00-ee6b246ffac4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 11.559s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 856.914944] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Acquiring lock "eeed1f83-89d2-4887-9e44-b269a2e295ae" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 856.915275] 
env[61439]: DEBUG oslo_concurrency.lockutils [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Lock "eeed1f83-89d2-4887-9e44-b269a2e295ae" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 856.932105] env[61439]: DEBUG nova.compute.manager [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 857.020969] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 857.021305] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 857.022856] env[61439]: INFO nova.compute.claims [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 857.176218] 
env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2711dfc-7204-4723-b7e1-741d356eb76f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 857.187115] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bb176bf-2116-4faa-bd35-7fbb2d73751f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 857.221095] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe11c4fa-5647-42c9-b710-244cffcf05f8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 857.229496] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14774345-6a42-4782-bdf4-af1ca26e3fc9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 857.246238] env[61439]: DEBUG nova.compute.provider_tree [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 857.256659] env[61439]: DEBUG nova.scheduler.client.report [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 
'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 857.279126] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.258s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 857.279914] env[61439]: DEBUG nova.compute.manager [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 857.318486] env[61439]: DEBUG nova.compute.utils [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 857.319771] env[61439]: DEBUG nova.compute.manager [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Not allocating networking since 'none' was specified. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 857.333287] env[61439]: DEBUG nova.compute.manager [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 857.437160] env[61439]: DEBUG nova.compute.manager [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 857.480349] env[61439]: DEBUG nova.virt.hardware [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 857.480533] env[61439]: DEBUG nova.virt.hardware [None 
req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 857.481135] env[61439]: DEBUG nova.virt.hardware [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 857.481135] env[61439]: DEBUG nova.virt.hardware [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 857.481135] env[61439]: DEBUG nova.virt.hardware [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 857.481676] env[61439]: DEBUG nova.virt.hardware [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 857.481676] env[61439]: DEBUG nova.virt.hardware [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 857.481906] env[61439]: DEBUG nova.virt.hardware [None 
req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 857.482442] env[61439]: DEBUG nova.virt.hardware [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 857.482442] env[61439]: DEBUG nova.virt.hardware [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 857.482442] env[61439]: DEBUG nova.virt.hardware [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 857.483697] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19339de1-dfcb-43ef-9450-e9408b6852de {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 857.489890] env[61439]: DEBUG nova.network.neutron [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Successfully created port: b3514446-5d31-494d-bb57-807994fe501e {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 857.506108] env[61439]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69beea5b-28b8-42b0-8329-29153cdd753d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 857.529519] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Instance VIF info [] {{(pid=61439) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 857.537849] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Creating folder: Project (de0bdfa4bde94371a7f9ec77e16e894f). Parent ref: group-v221281. {{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 857.537849] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5d538ee9-3f84-4bd9-bda2-3232e4f3742c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 857.555341] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Created folder: Project (de0bdfa4bde94371a7f9ec77e16e894f) in parent group-v221281. [ 857.555341] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Creating folder: Instances. Parent ref: group-v221306. 
{{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 857.555341] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c33c0806-a409-443a-8f43-557783e8c1d5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 857.572209] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Created folder: Instances in parent group-v221306. [ 857.572209] env[61439]: DEBUG oslo.service.loopingcall [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 857.572209] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Creating VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 857.572209] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7b294163-a803-4796-ab56-9035785968fc {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 857.598546] env[61439]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 857.598546] env[61439]: value = "task-987691" [ 857.598546] env[61439]: _type = "Task" [ 857.598546] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 857.612256] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987691, 'name': CreateVM_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 857.966200] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Acquiring lock "e51da790-9736-4181-9562-1a8f87895bd2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 857.966200] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Lock "e51da790-9736-4181-9562-1a8f87895bd2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 857.977071] env[61439]: DEBUG nova.compute.manager [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 858.040973] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 858.040973] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 858.040973] env[61439]: INFO nova.compute.claims [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 858.116394] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987691, 'name': CreateVM_Task, 'duration_secs': 0.280386} completed successfully. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 858.116810] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Created VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 858.117734] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 858.117734] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 858.117734] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 858.117734] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3bc7e113-3ffb-461e-ac1a-803d1f708d12 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 858.122492] env[61439]: DEBUG oslo_vmware.api [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 
tempest-ServersAaction247Test-2099232595-project-member] Waiting for the task: (returnval){ [ 858.122492] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]520d3ea5-01bf-6a8e-db14-bb6504e04298" [ 858.122492] env[61439]: _type = "Task" [ 858.122492] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 858.134147] env[61439]: DEBUG oslo_vmware.api [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]520d3ea5-01bf-6a8e-db14-bb6504e04298, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 858.225048] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95bddac5-6271-45d8-b269-aabac964b33b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 858.233589] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f98636ee-adbb-40b8-9f39-12843457708a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 858.270552] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-802388d6-dc26-43e1-aa46-026e7d56385d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 858.282463] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a89d3fb-1b30-4ef1-94fd-4d8f8c232235 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 858.298016] env[61439]: DEBUG nova.compute.provider_tree [None 
req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 858.310845] env[61439]: DEBUG nova.scheduler.client.report [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 858.330582] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.291s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 858.330582] env[61439]: DEBUG nova.compute.manager [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Start building networks asynchronously for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 858.382839] env[61439]: DEBUG nova.compute.utils [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 858.384263] env[61439]: DEBUG nova.compute.manager [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 858.385036] env[61439]: DEBUG nova.network.neutron [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 858.399712] env[61439]: DEBUG nova.compute.manager [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 858.476354] env[61439]: DEBUG nova.compute.manager [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 858.508644] env[61439]: DEBUG nova.virt.hardware [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 858.508889] env[61439]: DEBUG nova.virt.hardware [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 858.509146] env[61439]: DEBUG nova.virt.hardware [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 858.509247] env[61439]: DEBUG nova.virt.hardware [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 
tempest-AttachVolumeShelveTestJSON-664442013-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 858.509531] env[61439]: DEBUG nova.virt.hardware [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 858.509531] env[61439]: DEBUG nova.virt.hardware [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 858.509723] env[61439]: DEBUG nova.virt.hardware [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 858.509923] env[61439]: DEBUG nova.virt.hardware [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 858.510070] env[61439]: DEBUG nova.virt.hardware [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 858.510274] env[61439]: DEBUG nova.virt.hardware [None 
req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 858.510433] env[61439]: DEBUG nova.virt.hardware [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 858.512201] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f73c564-0644-41b5-80b7-14ec0fa61b49 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 858.521175] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-676ebcec-72a6-43c3-bb03-d2b00d41e3e5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 858.633735] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 858.633735] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Processing image a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 858.633735] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 858.663972] env[61439]: DEBUG nova.policy [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5c5974ebfc844e4c8a2947542dd55524', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9ff88efffe9443a39391aab1d573993a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 858.843116] env[61439]: DEBUG nova.compute.manager [req-3dfb7fc9-3973-4699-ae08-80b9e72d4dd2 req-17839c22-b844-4811-b2c5-6527e976ee70 service nova] [instance: c9574f78-b316-467b-acbe-2122886c2990] Received event network-changed-03976688-9968-4c5b-b4cf-f1d6882dd680 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 858.843513] env[61439]: DEBUG nova.compute.manager [req-3dfb7fc9-3973-4699-ae08-80b9e72d4dd2 req-17839c22-b844-4811-b2c5-6527e976ee70 service nova] [instance: c9574f78-b316-467b-acbe-2122886c2990] Refreshing instance network info cache due to event network-changed-03976688-9968-4c5b-b4cf-f1d6882dd680. 
{{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 858.843624] env[61439]: DEBUG oslo_concurrency.lockutils [req-3dfb7fc9-3973-4699-ae08-80b9e72d4dd2 req-17839c22-b844-4811-b2c5-6527e976ee70 service nova] Acquiring lock "refresh_cache-c9574f78-b316-467b-acbe-2122886c2990" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 858.843808] env[61439]: DEBUG oslo_concurrency.lockutils [req-3dfb7fc9-3973-4699-ae08-80b9e72d4dd2 req-17839c22-b844-4811-b2c5-6527e976ee70 service nova] Acquired lock "refresh_cache-c9574f78-b316-467b-acbe-2122886c2990" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 858.843905] env[61439]: DEBUG nova.network.neutron [req-3dfb7fc9-3973-4699-ae08-80b9e72d4dd2 req-17839c22-b844-4811-b2c5-6527e976ee70 service nova] [instance: c9574f78-b316-467b-acbe-2122886c2990] Refreshing network info cache for port 03976688-9968-4c5b-b4cf-f1d6882dd680 {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 858.910887] env[61439]: ERROR nova.compute.manager [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 03976688-9968-4c5b-b4cf-f1d6882dd680, please check neutron logs for more information. 
[ 858.910887] env[61439]: ERROR nova.compute.manager Traceback (most recent call last): [ 858.910887] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 858.910887] env[61439]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 858.910887] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 858.910887] env[61439]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 858.910887] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 858.910887] env[61439]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 858.910887] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 858.910887] env[61439]: ERROR nova.compute.manager self.force_reraise() [ 858.910887] env[61439]: ERROR nova.compute.manager File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 858.910887] env[61439]: ERROR nova.compute.manager raise self.value [ 858.910887] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 858.910887] env[61439]: ERROR nova.compute.manager updated_port = self._update_port( [ 858.910887] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 858.910887] env[61439]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 858.911571] env[61439]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 858.911571] env[61439]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 858.911571] env[61439]: 
ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 03976688-9968-4c5b-b4cf-f1d6882dd680, please check neutron logs for more information. [ 858.911571] env[61439]: ERROR nova.compute.manager [ 858.911571] env[61439]: Traceback (most recent call last): [ 858.911571] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/poll.py", line 111, in wait [ 858.911571] env[61439]: listener.cb(fileno) [ 858.911571] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 858.911571] env[61439]: result = function(*args, **kwargs) [ 858.911571] env[61439]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 858.911571] env[61439]: return func(*args, **kwargs) [ 858.911571] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 858.911571] env[61439]: raise e [ 858.911571] env[61439]: File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 858.911571] env[61439]: nwinfo = self.network_api.allocate_for_instance( [ 858.911571] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 858.911571] env[61439]: created_port_ids = self._update_ports_for_instance( [ 858.911571] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 858.911571] env[61439]: with excutils.save_and_reraise_exception(): [ 858.911571] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 858.911571] env[61439]: self.force_reraise() [ 858.911571] env[61439]: File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 858.911571] env[61439]: raise self.value [ 858.911571] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 858.911571] env[61439]: 
updated_port = self._update_port( [ 858.911571] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 858.911571] env[61439]: _ensure_no_port_binding_failure(port) [ 858.911571] env[61439]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 858.911571] env[61439]: raise exception.PortBindingFailed(port_id=port['id']) [ 858.912445] env[61439]: nova.exception.PortBindingFailed: Binding failed for port 03976688-9968-4c5b-b4cf-f1d6882dd680, please check neutron logs for more information. [ 858.912445] env[61439]: Removing descriptor: 25 [ 858.912445] env[61439]: ERROR nova.compute.manager [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 03976688-9968-4c5b-b4cf-f1d6882dd680, please check neutron logs for more information. 
[ 858.912445] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] Traceback (most recent call last): [ 858.912445] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 858.912445] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] yield resources [ 858.912445] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 858.912445] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] self.driver.spawn(context, instance, image_meta, [ 858.912445] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 858.912445] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] self._vmops.spawn(context, instance, image_meta, injected_files, [ 858.912445] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 858.912445] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] vm_ref = self.build_virtual_machine(instance, [ 858.912870] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 858.912870] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] vif_infos = vmwarevif.get_vif_info(self._session, [ 858.912870] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 858.912870] env[61439]: ERROR 
nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] for vif in network_info: [ 858.912870] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 858.912870] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] return self._sync_wrapper(fn, *args, **kwargs) [ 858.912870] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 858.912870] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] self.wait() [ 858.912870] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 858.912870] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] self[:] = self._gt.wait() [ 858.912870] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait [ 858.912870] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] return self._exit_event.wait() [ 858.912870] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 858.913266] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] result = hub.switch() [ 858.913266] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 858.913266] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] return self.greenlet.switch() [ 858.913266] 
env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main [ 858.913266] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] result = function(*args, **kwargs) [ 858.913266] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper [ 858.913266] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] return func(*args, **kwargs) [ 858.913266] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async [ 858.913266] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] raise e [ 858.913266] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async [ 858.913266] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] nwinfo = self.network_api.allocate_for_instance( [ 858.913266] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 858.913266] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] created_port_ids = self._update_ports_for_instance( [ 858.913659] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 858.913659] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] with excutils.save_and_reraise_exception(): [ 858.913659] env[61439]: ERROR nova.compute.manager 
[instance: c9574f78-b316-467b-acbe-2122886c2990] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 858.913659] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] self.force_reraise() [ 858.913659] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 858.913659] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] raise self.value [ 858.913659] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 858.913659] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] updated_port = self._update_port( [ 858.913659] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 858.913659] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] _ensure_no_port_binding_failure(port) [ 858.913659] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 858.913659] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] raise exception.PortBindingFailed(port_id=port['id']) [ 858.914141] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] nova.exception.PortBindingFailed: Binding failed for port 03976688-9968-4c5b-b4cf-f1d6882dd680, please check neutron logs for more information. 
[ 858.914141] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] [ 858.914141] env[61439]: INFO nova.compute.manager [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] Terminating instance [ 858.914141] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Acquiring lock "refresh_cache-c9574f78-b316-467b-acbe-2122886c2990" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 858.929283] env[61439]: DEBUG nova.network.neutron [req-3dfb7fc9-3973-4699-ae08-80b9e72d4dd2 req-17839c22-b844-4811-b2c5-6527e976ee70 service nova] [instance: c9574f78-b316-467b-acbe-2122886c2990] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 859.133222] env[61439]: DEBUG nova.network.neutron [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Successfully updated port: b3514446-5d31-494d-bb57-807994fe501e {{(pid=61439) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 859.159876] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Acquiring lock "refresh_cache-d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 859.160131] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 
tempest-ServerTagsTestJSON-568839792-project-member] Acquired lock "refresh_cache-d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 859.160323] env[61439]: DEBUG nova.network.neutron [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 859.227166] env[61439]: DEBUG nova.network.neutron [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 859.451300] env[61439]: DEBUG nova.network.neutron [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Updating instance_info_cache with network_info: [{"id": "b3514446-5d31-494d-bb57-807994fe501e", "address": "fa:16:3e:9b:4d:0a", "network": {"id": "31cdd88d-c715-4a6b-9e68-24c2cd292589", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1792630979-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ec2e5213b768477f82c160e45d6ebe6b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, 
"nsx-logical-switch-id": "e1d25020-c621-4388-ac1d-de55bfefbe50", "external-id": "nsx-vlan-transportzone-573", "segmentation_id": 573, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb3514446-5d", "ovs_interfaceid": "b3514446-5d31-494d-bb57-807994fe501e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 859.469340] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Releasing lock "refresh_cache-d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 859.469672] env[61439]: DEBUG nova.compute.manager [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Instance network_info: |[{"id": "b3514446-5d31-494d-bb57-807994fe501e", "address": "fa:16:3e:9b:4d:0a", "network": {"id": "31cdd88d-c715-4a6b-9e68-24c2cd292589", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1792630979-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ec2e5213b768477f82c160e45d6ebe6b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e1d25020-c621-4388-ac1d-de55bfefbe50", "external-id": 
"nsx-vlan-transportzone-573", "segmentation_id": 573, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb3514446-5d", "ovs_interfaceid": "b3514446-5d31-494d-bb57-807994fe501e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 859.470531] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9b:4d:0a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'e1d25020-c621-4388-ac1d-de55bfefbe50', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b3514446-5d31-494d-bb57-807994fe501e', 'vif_model': 'vmxnet3'}] {{(pid=61439) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 859.481978] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Creating folder: Project (ec2e5213b768477f82c160e45d6ebe6b). Parent ref: group-v221281. {{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 859.482605] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e78a1976-ba78-40d2-9a93-485582b6a390 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 859.493602] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Created folder: Project (ec2e5213b768477f82c160e45d6ebe6b) in parent group-v221281. 
[ 859.494059] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Creating folder: Instances. Parent ref: group-v221309. {{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 859.494059] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7bd986af-72b0-4984-af46-ac1a6a1bdd4c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 859.503227] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Created folder: Instances in parent group-v221309. [ 859.503460] env[61439]: DEBUG oslo.service.loopingcall [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 859.503644] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Creating VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 859.503843] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-231d9597-9fd3-44c9-b4bc-dce4718cbd92 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 859.523218] env[61439]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 859.523218] env[61439]: value = "task-987694" [ 859.523218] env[61439]: _type = "Task" [ 859.523218] env[61439]: } to complete. 
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 859.532141] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987694, 'name': CreateVM_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 859.706049] env[61439]: DEBUG nova.network.neutron [req-3dfb7fc9-3973-4699-ae08-80b9e72d4dd2 req-17839c22-b844-4811-b2c5-6527e976ee70 service nova] [instance: c9574f78-b316-467b-acbe-2122886c2990] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 859.717500] env[61439]: DEBUG oslo_concurrency.lockutils [req-3dfb7fc9-3973-4699-ae08-80b9e72d4dd2 req-17839c22-b844-4811-b2c5-6527e976ee70 service nova] Releasing lock "refresh_cache-c9574f78-b316-467b-acbe-2122886c2990" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 859.717941] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Acquired lock "refresh_cache-c9574f78-b316-467b-acbe-2122886c2990" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 859.718161] env[61439]: DEBUG nova.network.neutron [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 859.799242] env[61439]: DEBUG nova.network.neutron [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: 
c9574f78-b316-467b-acbe-2122886c2990] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 859.926326] env[61439]: DEBUG nova.compute.manager [req-648c74bc-3add-41b3-b617-84bb081437d7 req-8d89f115-0eff-4384-ac17-5a1681c2e0f0 service nova] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Received event network-vif-plugged-b3514446-5d31-494d-bb57-807994fe501e {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 859.926488] env[61439]: DEBUG oslo_concurrency.lockutils [req-648c74bc-3add-41b3-b617-84bb081437d7 req-8d89f115-0eff-4384-ac17-5a1681c2e0f0 service nova] Acquiring lock "d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 859.926711] env[61439]: DEBUG oslo_concurrency.lockutils [req-648c74bc-3add-41b3-b617-84bb081437d7 req-8d89f115-0eff-4384-ac17-5a1681c2e0f0 service nova] Lock "d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 859.926930] env[61439]: DEBUG oslo_concurrency.lockutils [req-648c74bc-3add-41b3-b617-84bb081437d7 req-8d89f115-0eff-4384-ac17-5a1681c2e0f0 service nova] Lock "d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 859.928629] env[61439]: DEBUG nova.compute.manager [req-648c74bc-3add-41b3-b617-84bb081437d7 req-8d89f115-0eff-4384-ac17-5a1681c2e0f0 service nova] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] No waiting events found dispatching 
network-vif-plugged-b3514446-5d31-494d-bb57-807994fe501e {{(pid=61439) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 859.928629] env[61439]: WARNING nova.compute.manager [req-648c74bc-3add-41b3-b617-84bb081437d7 req-8d89f115-0eff-4384-ac17-5a1681c2e0f0 service nova] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Received unexpected event network-vif-plugged-b3514446-5d31-494d-bb57-807994fe501e for instance with vm_state building and task_state spawning. [ 860.034629] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987694, 'name': CreateVM_Task, 'duration_secs': 0.353273} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 860.034815] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Created VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 860.050162] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 860.050338] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 860.050653] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Acquired external semaphore "[datastore2] 
devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 860.051045] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b9b87bf0-7027-4059-be96-6e1d896fcca5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.056281] env[61439]: DEBUG oslo_vmware.api [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Waiting for the task: (returnval){ [ 860.056281] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]529b9666-dbe7-aca9-49a7-7b68d4c2e484" [ 860.056281] env[61439]: _type = "Task" [ 860.056281] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 860.066540] env[61439]: DEBUG oslo_vmware.api [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]529b9666-dbe7-aca9-49a7-7b68d4c2e484, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 860.094405] env[61439]: DEBUG nova.network.neutron [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 860.109357] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Releasing lock "refresh_cache-c9574f78-b316-467b-acbe-2122886c2990" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 860.109790] env[61439]: DEBUG nova.compute.manager [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 860.109983] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 860.110595] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-60d2505b-cc62-4567-b44e-73549fb9b9f9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.122687] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83528a39-5337-4436-aae5-1da15aebdb26 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.149172] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c9574f78-b316-467b-acbe-2122886c2990 could not be found. 
[ 860.149494] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 860.149585] env[61439]: INFO nova.compute.manager [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] Took 0.04 seconds to destroy the instance on the hypervisor. [ 860.149834] env[61439]: DEBUG oslo.service.loopingcall [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 860.150083] env[61439]: DEBUG nova.compute.manager [-] [instance: c9574f78-b316-467b-acbe-2122886c2990] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 860.150219] env[61439]: DEBUG nova.network.neutron [-] [instance: c9574f78-b316-467b-acbe-2122886c2990] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 860.197239] env[61439]: DEBUG nova.network.neutron [-] [instance: c9574f78-b316-467b-acbe-2122886c2990] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 860.201240] env[61439]: DEBUG oslo_concurrency.lockutils [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Acquiring lock "7f0c1eef-750c-4d8f-8d90-a02898fdeee1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 860.201660] env[61439]: DEBUG oslo_concurrency.lockutils [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Lock "7f0c1eef-750c-4d8f-8d90-a02898fdeee1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 860.211712] env[61439]: DEBUG nova.network.neutron [-] [instance: c9574f78-b316-467b-acbe-2122886c2990] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 860.219284] env[61439]: DEBUG nova.compute.manager [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 860.225772] env[61439]: INFO nova.compute.manager [-] [instance: c9574f78-b316-467b-acbe-2122886c2990] Took 0.08 seconds to deallocate network for instance. 
[ 860.227842] env[61439]: DEBUG nova.compute.claims [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 860.228568] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 860.232014] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 860.253824] env[61439]: DEBUG nova.network.neutron [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Successfully created port: 4982006c-7d3f-4f66-80f8-ceba8f534c65 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 860.321170] env[61439]: DEBUG oslo_concurrency.lockutils [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 860.421186] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-545f7d31-be73-4c15-a4dd-7725226fbf0a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.431586] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c6f9b4c-ae22-46d2-8fa3-f297fc4685eb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.474358] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2cb1fd42-d73a-40b9-bbb1-0b9795faee09 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.485069] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a776321a-3fc2-43db-ac46-d1f4c181ec45 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.499703] env[61439]: DEBUG nova.compute.provider_tree [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 860.508949] env[61439]: DEBUG nova.scheduler.client.report [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': 
{'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 860.528567] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.299s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 860.529207] env[61439]: ERROR nova.compute.manager [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 03976688-9968-4c5b-b4cf-f1d6882dd680, please check neutron logs for more information. 
[ 860.529207] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] Traceback (most recent call last):
[ 860.529207] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 860.529207] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]     self.driver.spawn(context, instance, image_meta,
[ 860.529207] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 860.529207] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 860.529207] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 860.529207] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]     vm_ref = self.build_virtual_machine(instance,
[ 860.529207] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 860.529207] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 860.529207] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 860.529552] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]     for vif in network_info:
[ 860.529552] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 860.529552] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]     return self._sync_wrapper(fn, *args, **kwargs)
[ 860.529552] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 860.529552] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]     self.wait()
[ 860.529552] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 860.529552] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]     self[:] = self._gt.wait()
[ 860.529552] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 181, in wait
[ 860.529552] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]     return self._exit_event.wait()
[ 860.529552] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 860.529552] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]     result = hub.switch()
[ 860.529552] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 860.529552] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]     return self.greenlet.switch()
[ 860.529987] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 860.529987] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]     result = function(*args, **kwargs)
[ 860.529987] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]   File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 860.529987] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]     return func(*args, **kwargs)
[ 860.529987] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]   File "/opt/stack/nova/nova/compute/manager.py", line 1986, in _allocate_network_async
[ 860.529987] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]     raise e
[ 860.529987] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]   File "/opt/stack/nova/nova/compute/manager.py", line 1964, in _allocate_network_async
[ 860.529987] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]     nwinfo = self.network_api.allocate_for_instance(
[ 860.529987] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 860.529987] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]     created_port_ids = self._update_ports_for_instance(
[ 860.529987] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 860.529987] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]     with excutils.save_and_reraise_exception():
[ 860.529987] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 860.530395] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]     self.force_reraise()
[ 860.530395] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 860.530395] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]     raise self.value
[ 860.530395] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 860.530395] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]     updated_port = self._update_port(
[ 860.530395] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 860.530395] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]     _ensure_no_port_binding_failure(port)
[ 860.530395] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 860.530395] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990]     raise exception.PortBindingFailed(port_id=port['id'])
[ 860.530395] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] nova.exception.PortBindingFailed: Binding failed for port 03976688-9968-4c5b-b4cf-f1d6882dd680, please check neutron logs for more information.
[ 860.530395] env[61439]: ERROR nova.compute.manager [instance: c9574f78-b316-467b-acbe-2122886c2990] [ 860.530739] env[61439]: DEBUG nova.compute.utils [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] Binding failed for port 03976688-9968-4c5b-b4cf-f1d6882dd680, please check neutron logs for more information. {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 860.530967] env[61439]: DEBUG oslo_concurrency.lockutils [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.210s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 860.533963] env[61439]: INFO nova.compute.claims [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 860.537160] env[61439]: DEBUG nova.compute.manager [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] Build of instance c9574f78-b316-467b-acbe-2122886c2990 was re-scheduled: Binding failed for port 03976688-9968-4c5b-b4cf-f1d6882dd680, please check neutron logs for more information. 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 860.537578] env[61439]: DEBUG nova.compute.manager [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 860.537802] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Acquiring lock "refresh_cache-c9574f78-b316-467b-acbe-2122886c2990" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 860.537953] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Acquired lock "refresh_cache-c9574f78-b316-467b-acbe-2122886c2990" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 860.538195] env[61439]: DEBUG nova.network.neutron [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 860.567922] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 
860.568342] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Processing image a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 860.568566] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 860.583655] env[61439]: DEBUG nova.network.neutron [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 860.593822] env[61439]: DEBUG oslo_concurrency.lockutils [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "8089bd3f-47e7-4490-8bfc-a1d87bf559ef" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 860.594070] env[61439]: DEBUG oslo_concurrency.lockutils [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "8089bd3f-47e7-4490-8bfc-a1d87bf559ef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 860.607636] env[61439]: DEBUG nova.compute.manager [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 860.678947] env[61439]: DEBUG oslo_concurrency.lockutils [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 860.736887] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03218f63-3d6e-479d-a1ca-2ad170b63519 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.745218] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43978b55-9ac5-4817-be94-32de893c04dd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.775962] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94394a84-831e-4289-a190-afd0b9188fbe {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.783605] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b417a41-97ac-417e-9b2a-cc15b63f9147 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 860.796478] env[61439]: DEBUG nova.compute.provider_tree [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 860.806584] env[61439]: DEBUG 
nova.scheduler.client.report [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 860.825490] env[61439]: DEBUG nova.network.neutron [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 860.828879] env[61439]: DEBUG oslo_concurrency.lockutils [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.296s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 860.828879] env[61439]: DEBUG nova.compute.manager [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Start building networks asynchronously for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 860.830300] env[61439]: DEBUG oslo_concurrency.lockutils [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.151s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 860.831647] env[61439]: INFO nova.compute.claims [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 860.836155] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Releasing lock "refresh_cache-c9574f78-b316-467b-acbe-2122886c2990" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 860.836155] env[61439]: DEBUG nova.compute.manager [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 860.836155] env[61439]: DEBUG nova.compute.manager [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 860.836155] env[61439]: DEBUG nova.network.neutron [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 860.861840] env[61439]: DEBUG nova.network.neutron [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 860.864690] env[61439]: DEBUG nova.compute.utils [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 860.866813] env[61439]: DEBUG nova.compute.manager [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 860.866813] env[61439]: DEBUG nova.network.neutron [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 860.869478] env[61439]: DEBUG nova.network.neutron [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 860.882150] env[61439]: INFO nova.compute.manager [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] [instance: c9574f78-b316-467b-acbe-2122886c2990] Took 0.05 seconds to deallocate network for instance. [ 860.901932] env[61439]: DEBUG nova.compute.manager [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 860.983657] env[61439]: DEBUG nova.compute.manager [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 861.001371] env[61439]: DEBUG nova.policy [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '83e896d2715d4448aaed2f994e6e88e3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '473b8bea722142ecbc6aa9cc36556e7a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 861.020022] env[61439]: INFO nova.scheduler.client.report [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Deleted allocations for instance c9574f78-b316-467b-acbe-2122886c2990 [ 861.027685] env[61439]: DEBUG nova.virt.hardware [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 861.029152] env[61439]: DEBUG nova.virt.hardware [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 861.029152] env[61439]: DEBUG nova.virt.hardware [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 861.029152] env[61439]: DEBUG nova.virt.hardware [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 861.029152] env[61439]: DEBUG nova.virt.hardware [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 861.029152] env[61439]: DEBUG nova.virt.hardware [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, 
cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 861.029527] env[61439]: DEBUG nova.virt.hardware [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 861.029527] env[61439]: DEBUG nova.virt.hardware [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 861.029527] env[61439]: DEBUG nova.virt.hardware [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 861.029527] env[61439]: DEBUG nova.virt.hardware [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 861.029527] env[61439]: DEBUG nova.virt.hardware [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 861.032432] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-4c78d258-0265-49a1-890e-53be4bd8ca46 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 861.044821] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21536477-4f82-414d-9bbc-5cbc0211f74b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 861.065763] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fab0fa52-8b5c-420f-93a2-fcc49f6cf44e tempest-AttachVolumeNegativeTest-1225387431 tempest-AttachVolumeNegativeTest-1225387431-project-member] Lock "c9574f78-b316-467b-acbe-2122886c2990" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.613s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 861.081698] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17eb232e-b2af-4c63-9d7e-4b0b6b89daef {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 861.089133] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bfcf22b-dbbd-48be-8e42-87a1619649e0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 861.123797] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b731f93d-be95-4e7e-8d29-4c4fa303e2e9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 861.131719] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4bf782a4-2384-4913-87fb-160d38360671 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 861.151050] env[61439]: DEBUG nova.compute.provider_tree [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 861.162541] env[61439]: DEBUG nova.scheduler.client.report [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 861.179473] env[61439]: DEBUG oslo_concurrency.lockutils [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.349s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 861.179757] env[61439]: DEBUG nova.compute.manager [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Start building networks asynchronously for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 861.229399] env[61439]: DEBUG nova.compute.utils [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 861.231167] env[61439]: DEBUG nova.compute.manager [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 861.231350] env[61439]: DEBUG nova.network.neutron [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 861.241755] env[61439]: DEBUG nova.compute.manager [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 861.314682] env[61439]: DEBUG nova.compute.manager [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 861.322132] env[61439]: DEBUG nova.policy [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2af2fd8431af45ca891f744f4d10b54f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ca364a2df93a424f8b66ee39d9b0b120', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 861.344158] env[61439]: DEBUG nova.virt.hardware [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 861.344412] env[61439]: DEBUG nova.virt.hardware [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 
tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 861.344573] env[61439]: DEBUG nova.virt.hardware [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 861.344799] env[61439]: DEBUG nova.virt.hardware [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 861.344955] env[61439]: DEBUG nova.virt.hardware [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 861.345130] env[61439]: DEBUG nova.virt.hardware [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 861.345347] env[61439]: DEBUG nova.virt.hardware [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 861.345509] env[61439]: DEBUG nova.virt.hardware [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 
tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 861.345677] env[61439]: DEBUG nova.virt.hardware [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 861.345839] env[61439]: DEBUG nova.virt.hardware [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 861.346018] env[61439]: DEBUG nova.virt.hardware [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 861.346908] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-835210f1-72e9-42a7-817f-4bfa1dbcaa94 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 861.357773] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c79eb93-fc50-4cee-9998-d8ad18c82e30 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 861.839802] env[61439]: DEBUG nova.network.neutron [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 
8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Successfully created port: f3361e85-65a2-4f3c-80d6-17a3aeb90792 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 862.081395] env[61439]: DEBUG nova.compute.manager [req-144f9374-fdde-4bc2-8725-e521be1a9d40 req-abc7095a-4085-4917-b560-b72b6038602b service nova] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Received event network-changed-b3514446-5d31-494d-bb57-807994fe501e {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 862.081782] env[61439]: DEBUG nova.compute.manager [req-144f9374-fdde-4bc2-8725-e521be1a9d40 req-abc7095a-4085-4917-b560-b72b6038602b service nova] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Refreshing instance network info cache due to event network-changed-b3514446-5d31-494d-bb57-807994fe501e. {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 862.082116] env[61439]: DEBUG oslo_concurrency.lockutils [req-144f9374-fdde-4bc2-8725-e521be1a9d40 req-abc7095a-4085-4917-b560-b72b6038602b service nova] Acquiring lock "refresh_cache-d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 862.082383] env[61439]: DEBUG oslo_concurrency.lockutils [req-144f9374-fdde-4bc2-8725-e521be1a9d40 req-abc7095a-4085-4917-b560-b72b6038602b service nova] Acquired lock "refresh_cache-d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 862.082652] env[61439]: DEBUG nova.network.neutron [req-144f9374-fdde-4bc2-8725-e521be1a9d40 req-abc7095a-4085-4917-b560-b72b6038602b service nova] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Refreshing network info cache for port b3514446-5d31-494d-bb57-807994fe501e {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 862.182318] env[61439]: DEBUG nova.network.neutron 
[None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Successfully updated port: 4982006c-7d3f-4f66-80f8-ceba8f534c65 {{(pid=61439) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 862.200240] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Acquiring lock "refresh_cache-e51da790-9736-4181-9562-1a8f87895bd2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 862.200413] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Acquired lock "refresh_cache-e51da790-9736-4181-9562-1a8f87895bd2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 862.200567] env[61439]: DEBUG nova.network.neutron [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 862.343543] env[61439]: DEBUG nova.network.neutron [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 862.364478] env[61439]: DEBUG nova.network.neutron [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Successfully created port: 011642af-0917-4a0f-a2fd-fd291a82d866 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 863.268139] env[61439]: DEBUG nova.network.neutron [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Updating instance_info_cache with network_info: [{"id": "4982006c-7d3f-4f66-80f8-ceba8f534c65", "address": "fa:16:3e:19:e6:e1", "network": {"id": "46a4de18-523d-44a2-8e81-92b838d568cc", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309006852-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9ff88efffe9443a39391aab1d573993a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92f3cfd6-c130-4390-8910-865fbc42afd1", "external-id": "nsx-vlan-transportzone-142", "segmentation_id": 142, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4982006c-7d", "ovs_interfaceid": "4982006c-7d3f-4f66-80f8-ceba8f534c65", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) 
update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 863.279787] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Releasing lock "refresh_cache-e51da790-9736-4181-9562-1a8f87895bd2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 863.280111] env[61439]: DEBUG nova.compute.manager [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Instance network_info: |[{"id": "4982006c-7d3f-4f66-80f8-ceba8f534c65", "address": "fa:16:3e:19:e6:e1", "network": {"id": "46a4de18-523d-44a2-8e81-92b838d568cc", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309006852-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9ff88efffe9443a39391aab1d573993a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92f3cfd6-c130-4390-8910-865fbc42afd1", "external-id": "nsx-vlan-transportzone-142", "segmentation_id": 142, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4982006c-7d", "ovs_interfaceid": "4982006c-7d3f-4f66-80f8-ceba8f534c65", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=61439) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 863.280597] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:19:e6:e1', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '92f3cfd6-c130-4390-8910-865fbc42afd1', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '4982006c-7d3f-4f66-80f8-ceba8f534c65', 'vif_model': 'vmxnet3'}] {{(pid=61439) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 863.288078] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Creating folder: Project (9ff88efffe9443a39391aab1d573993a). Parent ref: group-v221281. {{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 863.288659] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-50596c10-258e-4cf4-a183-fc72eed17a39 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 863.299454] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Created folder: Project (9ff88efffe9443a39391aab1d573993a) in parent group-v221281. [ 863.299660] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Creating folder: Instances. Parent ref: group-v221312. 
{{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 863.299887] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-27f9ee78-e9ca-4238-8040-97e696497b9a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 863.309636] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Created folder: Instances in parent group-v221312. [ 863.309921] env[61439]: DEBUG oslo.service.loopingcall [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 863.310069] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Creating VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 863.310289] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-4150d02c-9e2e-4f11-9887-95571f563abf {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 863.332698] env[61439]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 863.332698] env[61439]: value = "task-987697" [ 863.332698] env[61439]: _type = "Task" [ 863.332698] env[61439]: } to complete. 
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 863.337437] env[61439]: DEBUG nova.network.neutron [req-144f9374-fdde-4bc2-8725-e521be1a9d40 req-abc7095a-4085-4917-b560-b72b6038602b service nova] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Updated VIF entry in instance network info cache for port b3514446-5d31-494d-bb57-807994fe501e. {{(pid=61439) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 863.337545] env[61439]: DEBUG nova.network.neutron [req-144f9374-fdde-4bc2-8725-e521be1a9d40 req-abc7095a-4085-4917-b560-b72b6038602b service nova] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Updating instance_info_cache with network_info: [{"id": "b3514446-5d31-494d-bb57-807994fe501e", "address": "fa:16:3e:9b:4d:0a", "network": {"id": "31cdd88d-c715-4a6b-9e68-24c2cd292589", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1792630979-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ec2e5213b768477f82c160e45d6ebe6b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e1d25020-c621-4388-ac1d-de55bfefbe50", "external-id": "nsx-vlan-transportzone-573", "segmentation_id": 573, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb3514446-5d", "ovs_interfaceid": "b3514446-5d31-494d-bb57-807994fe501e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info 
/opt/stack/nova/nova/network/neutron.py:116}} [ 863.344541] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987697, 'name': CreateVM_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 863.356246] env[61439]: DEBUG oslo_concurrency.lockutils [req-144f9374-fdde-4bc2-8725-e521be1a9d40 req-abc7095a-4085-4917-b560-b72b6038602b service nova] Releasing lock "refresh_cache-d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 863.383168] env[61439]: DEBUG nova.compute.manager [req-a72db86c-9fa7-47ae-a686-cbefd6397e2a req-1af0b8e1-01ba-4810-984a-295154eeb0d1 service nova] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Received event network-vif-plugged-4982006c-7d3f-4f66-80f8-ceba8f534c65 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 863.383438] env[61439]: DEBUG oslo_concurrency.lockutils [req-a72db86c-9fa7-47ae-a686-cbefd6397e2a req-1af0b8e1-01ba-4810-984a-295154eeb0d1 service nova] Acquiring lock "e51da790-9736-4181-9562-1a8f87895bd2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 863.383691] env[61439]: DEBUG oslo_concurrency.lockutils [req-a72db86c-9fa7-47ae-a686-cbefd6397e2a req-1af0b8e1-01ba-4810-984a-295154eeb0d1 service nova] Lock "e51da790-9736-4181-9562-1a8f87895bd2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 863.384065] env[61439]: DEBUG oslo_concurrency.lockutils [req-a72db86c-9fa7-47ae-a686-cbefd6397e2a req-1af0b8e1-01ba-4810-984a-295154eeb0d1 service nova] Lock "e51da790-9736-4181-9562-1a8f87895bd2-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 863.384065] env[61439]: DEBUG nova.compute.manager [req-a72db86c-9fa7-47ae-a686-cbefd6397e2a req-1af0b8e1-01ba-4810-984a-295154eeb0d1 service nova] [instance: e51da790-9736-4181-9562-1a8f87895bd2] No waiting events found dispatching network-vif-plugged-4982006c-7d3f-4f66-80f8-ceba8f534c65 {{(pid=61439) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 863.384184] env[61439]: WARNING nova.compute.manager [req-a72db86c-9fa7-47ae-a686-cbefd6397e2a req-1af0b8e1-01ba-4810-984a-295154eeb0d1 service nova] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Received unexpected event network-vif-plugged-4982006c-7d3f-4f66-80f8-ceba8f534c65 for instance with vm_state building and task_state spawning. [ 863.384403] env[61439]: DEBUG nova.compute.manager [req-a72db86c-9fa7-47ae-a686-cbefd6397e2a req-1af0b8e1-01ba-4810-984a-295154eeb0d1 service nova] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Received event network-changed-4982006c-7d3f-4f66-80f8-ceba8f534c65 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 863.384586] env[61439]: DEBUG nova.compute.manager [req-a72db86c-9fa7-47ae-a686-cbefd6397e2a req-1af0b8e1-01ba-4810-984a-295154eeb0d1 service nova] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Refreshing instance network info cache due to event network-changed-4982006c-7d3f-4f66-80f8-ceba8f534c65. 
{{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 863.384804] env[61439]: DEBUG oslo_concurrency.lockutils [req-a72db86c-9fa7-47ae-a686-cbefd6397e2a req-1af0b8e1-01ba-4810-984a-295154eeb0d1 service nova] Acquiring lock "refresh_cache-e51da790-9736-4181-9562-1a8f87895bd2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 863.384965] env[61439]: DEBUG oslo_concurrency.lockutils [req-a72db86c-9fa7-47ae-a686-cbefd6397e2a req-1af0b8e1-01ba-4810-984a-295154eeb0d1 service nova] Acquired lock "refresh_cache-e51da790-9736-4181-9562-1a8f87895bd2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 863.385106] env[61439]: DEBUG nova.network.neutron [req-a72db86c-9fa7-47ae-a686-cbefd6397e2a req-1af0b8e1-01ba-4810-984a-295154eeb0d1 service nova] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Refreshing network info cache for port 4982006c-7d3f-4f66-80f8-ceba8f534c65 {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 863.386867] env[61439]: DEBUG nova.network.neutron [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Successfully updated port: f3361e85-65a2-4f3c-80d6-17a3aeb90792 {{(pid=61439) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 863.408250] env[61439]: DEBUG oslo_concurrency.lockutils [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "refresh_cache-8089bd3f-47e7-4490-8bfc-a1d87bf559ef" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 863.410107] env[61439]: DEBUG oslo_concurrency.lockutils [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 
tempest-DeleteServersTestJSON-64622698-project-member] Acquired lock "refresh_cache-8089bd3f-47e7-4490-8bfc-a1d87bf559ef" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 863.410107] env[61439]: DEBUG nova.network.neutron [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 863.542761] env[61439]: DEBUG nova.network.neutron [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 863.850910] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987697, 'name': CreateVM_Task, 'duration_secs': 0.328496} completed successfully. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 863.851286] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Created VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 863.852148] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 863.854067] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 863.854067] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 863.854067] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-344ddf8f-63f9-4ef3-b188-390f361b660a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 863.858346] env[61439]: DEBUG oslo_vmware.api [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 
tempest-AttachVolumeShelveTestJSON-664442013-project-member] Waiting for the task: (returnval){ [ 863.858346] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52c2eaf7-6253-a610-ce2b-fe67fcab51f7" [ 863.858346] env[61439]: _type = "Task" [ 863.858346] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 863.870082] env[61439]: DEBUG oslo_vmware.api [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52c2eaf7-6253-a610-ce2b-fe67fcab51f7, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 864.040157] env[61439]: DEBUG nova.network.neutron [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Updating instance_info_cache with network_info: [{"id": "f3361e85-65a2-4f3c-80d6-17a3aeb90792", "address": "fa:16:3e:cd:3f:d6", "network": {"id": "d027573c-7241-4127-b4ea-bcf829929285", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-837166819-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ca364a2df93a424f8b66ee39d9b0b120", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e7a44713-0af1-486e-bc0d-00e03a769fa4", "external-id": 
"nsx-vlan-transportzone-420", "segmentation_id": 420, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf3361e85-65", "ovs_interfaceid": "f3361e85-65a2-4f3c-80d6-17a3aeb90792", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 864.060681] env[61439]: DEBUG oslo_concurrency.lockutils [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Releasing lock "refresh_cache-8089bd3f-47e7-4490-8bfc-a1d87bf559ef" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 864.061043] env[61439]: DEBUG nova.compute.manager [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Instance network_info: |[{"id": "f3361e85-65a2-4f3c-80d6-17a3aeb90792", "address": "fa:16:3e:cd:3f:d6", "network": {"id": "d027573c-7241-4127-b4ea-bcf829929285", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-837166819-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ca364a2df93a424f8b66ee39d9b0b120", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e7a44713-0af1-486e-bc0d-00e03a769fa4", "external-id": "nsx-vlan-transportzone-420", "segmentation_id": 420, "bound_drivers": {"0": 
"nsxv3"}}, "devname": "tapf3361e85-65", "ovs_interfaceid": "f3361e85-65a2-4f3c-80d6-17a3aeb90792", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 864.061731] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:cd:3f:d6', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'e7a44713-0af1-486e-bc0d-00e03a769fa4', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f3361e85-65a2-4f3c-80d6-17a3aeb90792', 'vif_model': 'vmxnet3'}] {{(pid=61439) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 864.070898] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Creating folder: Project (ca364a2df93a424f8b66ee39d9b0b120). Parent ref: group-v221281. {{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 864.071881] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-475b511b-4b81-40b4-97e2-eea7b71edfe8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 864.082351] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Created folder: Project (ca364a2df93a424f8b66ee39d9b0b120) in parent group-v221281. 
[ 864.084308] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Creating folder: Instances. Parent ref: group-v221315. {{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 864.084308] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-234634a8-dba0-49b1-a9ec-7b745c916938 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 864.098208] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Created folder: Instances in parent group-v221315. [ 864.098208] env[61439]: DEBUG oslo.service.loopingcall [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 864.098208] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Creating VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 864.098208] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d080ed58-a7ce-4c1d-88e1-b353d67bb634 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 864.121010] env[61439]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 864.121010] env[61439]: value = "task-987700" [ 864.121010] env[61439]: _type = "Task" [ 864.121010] env[61439]: } to complete. 
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 864.129924] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987700, 'name': CreateVM_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 864.130823] env[61439]: DEBUG nova.network.neutron [req-a72db86c-9fa7-47ae-a686-cbefd6397e2a req-1af0b8e1-01ba-4810-984a-295154eeb0d1 service nova] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Updated VIF entry in instance network info cache for port 4982006c-7d3f-4f66-80f8-ceba8f534c65. {{(pid=61439) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 864.131224] env[61439]: DEBUG nova.network.neutron [req-a72db86c-9fa7-47ae-a686-cbefd6397e2a req-1af0b8e1-01ba-4810-984a-295154eeb0d1 service nova] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Updating instance_info_cache with network_info: [{"id": "4982006c-7d3f-4f66-80f8-ceba8f534c65", "address": "fa:16:3e:19:e6:e1", "network": {"id": "46a4de18-523d-44a2-8e81-92b838d568cc", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309006852-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9ff88efffe9443a39391aab1d573993a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92f3cfd6-c130-4390-8910-865fbc42afd1", "external-id": "nsx-vlan-transportzone-142", "segmentation_id": 142, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4982006c-7d", "ovs_interfaceid": "4982006c-7d3f-4f66-80f8-ceba8f534c65", "qbh_params": null, 
"qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 864.143276] env[61439]: DEBUG oslo_concurrency.lockutils [req-a72db86c-9fa7-47ae-a686-cbefd6397e2a req-1af0b8e1-01ba-4810-984a-295154eeb0d1 service nova] Releasing lock "refresh_cache-e51da790-9736-4181-9562-1a8f87895bd2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 864.374303] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 864.374983] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Processing image a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 864.375416] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 864.635053] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987700, 'name': CreateVM_Task, 'duration_secs': 0.391633} completed successfully. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 864.635138] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Created VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 864.635850] env[61439]: DEBUG oslo_concurrency.lockutils [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 864.636029] env[61439]: DEBUG oslo_concurrency.lockutils [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 864.636355] env[61439]: DEBUG oslo_concurrency.lockutils [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 864.636619] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b7ef7f8f-3e2e-4ada-8373-acbf91a537de {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 864.644051] env[61439]: DEBUG oslo_vmware.api [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Waiting for 
the task: (returnval){ [ 864.644051] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52ce2a4c-493d-05de-2beb-9be6b215f271" [ 864.644051] env[61439]: _type = "Task" [ 864.644051] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 864.651774] env[61439]: DEBUG oslo_vmware.api [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52ce2a4c-493d-05de-2beb-9be6b215f271, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 864.667396] env[61439]: DEBUG nova.network.neutron [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Successfully updated port: 011642af-0917-4a0f-a2fd-fd291a82d866 {{(pid=61439) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 864.681616] env[61439]: DEBUG oslo_concurrency.lockutils [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Acquiring lock "refresh_cache-7f0c1eef-750c-4d8f-8d90-a02898fdeee1" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 864.681772] env[61439]: DEBUG oslo_concurrency.lockutils [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Acquired lock "refresh_cache-7f0c1eef-750c-4d8f-8d90-a02898fdeee1" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 864.681964] env[61439]: DEBUG nova.network.neutron [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e 
tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 864.729227] env[61439]: DEBUG nova.compute.manager [req-135dee93-5647-4f66-b01e-a2fe3c9208ba req-0853be34-04c9-4cfd-8df6-27c0b693cf3c service nova] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Received event network-vif-plugged-f3361e85-65a2-4f3c-80d6-17a3aeb90792 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 864.729460] env[61439]: DEBUG oslo_concurrency.lockutils [req-135dee93-5647-4f66-b01e-a2fe3c9208ba req-0853be34-04c9-4cfd-8df6-27c0b693cf3c service nova] Acquiring lock "8089bd3f-47e7-4490-8bfc-a1d87bf559ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 864.729676] env[61439]: DEBUG oslo_concurrency.lockutils [req-135dee93-5647-4f66-b01e-a2fe3c9208ba req-0853be34-04c9-4cfd-8df6-27c0b693cf3c service nova] Lock "8089bd3f-47e7-4490-8bfc-a1d87bf559ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 864.729846] env[61439]: DEBUG oslo_concurrency.lockutils [req-135dee93-5647-4f66-b01e-a2fe3c9208ba req-0853be34-04c9-4cfd-8df6-27c0b693cf3c service nova] Lock "8089bd3f-47e7-4490-8bfc-a1d87bf559ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 864.730356] env[61439]: DEBUG nova.compute.manager [req-135dee93-5647-4f66-b01e-a2fe3c9208ba req-0853be34-04c9-4cfd-8df6-27c0b693cf3c service nova] 
[instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] No waiting events found dispatching network-vif-plugged-f3361e85-65a2-4f3c-80d6-17a3aeb90792 {{(pid=61439) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 864.730626] env[61439]: WARNING nova.compute.manager [req-135dee93-5647-4f66-b01e-a2fe3c9208ba req-0853be34-04c9-4cfd-8df6-27c0b693cf3c service nova] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Received unexpected event network-vif-plugged-f3361e85-65a2-4f3c-80d6-17a3aeb90792 for instance with vm_state building and task_state spawning. [ 864.730907] env[61439]: DEBUG nova.compute.manager [req-135dee93-5647-4f66-b01e-a2fe3c9208ba req-0853be34-04c9-4cfd-8df6-27c0b693cf3c service nova] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Received event network-changed-f3361e85-65a2-4f3c-80d6-17a3aeb90792 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 864.731058] env[61439]: DEBUG nova.compute.manager [req-135dee93-5647-4f66-b01e-a2fe3c9208ba req-0853be34-04c9-4cfd-8df6-27c0b693cf3c service nova] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Refreshing instance network info cache due to event network-changed-f3361e85-65a2-4f3c-80d6-17a3aeb90792. 
{{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 864.731264] env[61439]: DEBUG oslo_concurrency.lockutils [req-135dee93-5647-4f66-b01e-a2fe3c9208ba req-0853be34-04c9-4cfd-8df6-27c0b693cf3c service nova] Acquiring lock "refresh_cache-8089bd3f-47e7-4490-8bfc-a1d87bf559ef" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 864.731451] env[61439]: DEBUG oslo_concurrency.lockutils [req-135dee93-5647-4f66-b01e-a2fe3c9208ba req-0853be34-04c9-4cfd-8df6-27c0b693cf3c service nova] Acquired lock "refresh_cache-8089bd3f-47e7-4490-8bfc-a1d87bf559ef" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 864.731670] env[61439]: DEBUG nova.network.neutron [req-135dee93-5647-4f66-b01e-a2fe3c9208ba req-0853be34-04c9-4cfd-8df6-27c0b693cf3c service nova] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Refreshing network info cache for port f3361e85-65a2-4f3c-80d6-17a3aeb90792 {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 864.753862] env[61439]: DEBUG nova.network.neutron [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 864.987033] env[61439]: DEBUG nova.network.neutron [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Updating instance_info_cache with network_info: [{"id": "011642af-0917-4a0f-a2fd-fd291a82d866", "address": "fa:16:3e:0a:27:d3", "network": {"id": "9f46b2ca-ebc5-4432-94b0-2ae9d64c35f2", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1638112611-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "473b8bea722142ecbc6aa9cc36556e7a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2d88bb07-f93c-45ca-bce7-230cb1f33833", "external-id": "nsx-vlan-transportzone-387", "segmentation_id": 387, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap011642af-09", "ovs_interfaceid": "011642af-0917-4a0f-a2fd-fd291a82d866", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 865.002557] env[61439]: DEBUG oslo_concurrency.lockutils [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Releasing lock "refresh_cache-7f0c1eef-750c-4d8f-8d90-a02898fdeee1" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} 
[ 865.005276] env[61439]: DEBUG nova.compute.manager [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Instance network_info: |[{"id": "011642af-0917-4a0f-a2fd-fd291a82d866", "address": "fa:16:3e:0a:27:d3", "network": {"id": "9f46b2ca-ebc5-4432-94b0-2ae9d64c35f2", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1638112611-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "473b8bea722142ecbc6aa9cc36556e7a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2d88bb07-f93c-45ca-bce7-230cb1f33833", "external-id": "nsx-vlan-transportzone-387", "segmentation_id": 387, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap011642af-09", "ovs_interfaceid": "011642af-0917-4a0f-a2fd-fd291a82d866", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 865.005384] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0a:27:d3', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2d88bb07-f93c-45ca-bce7-230cb1f33833', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 
'iface_id': '011642af-0917-4a0f-a2fd-fd291a82d866', 'vif_model': 'vmxnet3'}] {{(pid=61439) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 865.013753] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Creating folder: Project (473b8bea722142ecbc6aa9cc36556e7a). Parent ref: group-v221281. {{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 865.014505] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5bea5cad-b2e0-4f86-87a6-6f5736f61293 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 865.025104] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Created folder: Project (473b8bea722142ecbc6aa9cc36556e7a) in parent group-v221281. [ 865.025329] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Creating folder: Instances. Parent ref: group-v221318. {{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 865.025577] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-53115db8-eaca-449c-ae0c-a8ac866e0636 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 865.035565] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Created folder: Instances in parent group-v221318. 
[ 865.037980] env[61439]: DEBUG oslo.service.loopingcall [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 865.037980] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Creating VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 865.037980] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-067ce994-f9a7-43b1-9131-0e23222d74fd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 865.062406] env[61439]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 865.062406] env[61439]: value = "task-987703" [ 865.062406] env[61439]: _type = "Task" [ 865.062406] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 865.074831] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987703, 'name': CreateVM_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 865.161060] env[61439]: DEBUG oslo_concurrency.lockutils [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 865.162155] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Processing image a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 865.162155] env[61439]: DEBUG oslo_concurrency.lockutils [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 865.183410] env[61439]: DEBUG nova.network.neutron [req-135dee93-5647-4f66-b01e-a2fe3c9208ba req-0853be34-04c9-4cfd-8df6-27c0b693cf3c service nova] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Updated VIF entry in instance network info cache for port f3361e85-65a2-4f3c-80d6-17a3aeb90792. 
{{(pid=61439) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 865.183410] env[61439]: DEBUG nova.network.neutron [req-135dee93-5647-4f66-b01e-a2fe3c9208ba req-0853be34-04c9-4cfd-8df6-27c0b693cf3c service nova] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Updating instance_info_cache with network_info: [{"id": "f3361e85-65a2-4f3c-80d6-17a3aeb90792", "address": "fa:16:3e:cd:3f:d6", "network": {"id": "d027573c-7241-4127-b4ea-bcf829929285", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-837166819-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ca364a2df93a424f8b66ee39d9b0b120", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e7a44713-0af1-486e-bc0d-00e03a769fa4", "external-id": "nsx-vlan-transportzone-420", "segmentation_id": 420, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf3361e85-65", "ovs_interfaceid": "f3361e85-65a2-4f3c-80d6-17a3aeb90792", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 865.203398] env[61439]: DEBUG oslo_concurrency.lockutils [req-135dee93-5647-4f66-b01e-a2fe3c9208ba req-0853be34-04c9-4cfd-8df6-27c0b693cf3c service nova] Releasing lock "refresh_cache-8089bd3f-47e7-4490-8bfc-a1d87bf559ef" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 865.255916] env[61439]: DEBUG oslo_concurrency.lockutils 
[None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Acquiring lock "b954e159-4d89-4c61-a5bc-5e6c67cf278c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 865.256200] env[61439]: DEBUG oslo_concurrency.lockutils [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Lock "b954e159-4d89-4c61-a5bc-5e6c67cf278c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 865.273361] env[61439]: DEBUG nova.compute.manager [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 865.361053] env[61439]: DEBUG oslo_concurrency.lockutils [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 865.361053] env[61439]: DEBUG oslo_concurrency.lockutils [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 865.361860] env[61439]: INFO nova.compute.claims [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 865.576255] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987703, 'name': CreateVM_Task, 'duration_secs': 0.402884} completed successfully. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 865.580659] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Created VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 865.583161] env[61439]: DEBUG oslo_concurrency.lockutils [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 865.583161] env[61439]: DEBUG oslo_concurrency.lockutils [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 865.583161] env[61439]: DEBUG oslo_concurrency.lockutils [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 865.583161] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-99fc6254-4eb1-4356-9af4-49a67cc9f8c6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 865.589029] env[61439]: DEBUG oslo_vmware.api [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] 
Waiting for the task: (returnval){ [ 865.589029] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]5247c7f5-9f0f-337a-72ee-3a63f1aba0b7" [ 865.589029] env[61439]: _type = "Task" [ 865.589029] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 865.598398] env[61439]: DEBUG oslo_vmware.api [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]5247c7f5-9f0f-337a-72ee-3a63f1aba0b7, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 865.614169] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7d830ce-7e24-4c6d-abc2-fa0c86755cc3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 865.623284] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52855e42-9412-4d4b-8a8b-bc24c596d442 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 865.656940] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a919a84-b1fe-419f-9716-b032c78951da {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 865.666641] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-511469e8-edc3-452b-8cca-8f67dd1b268f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 865.673465] env[61439]: DEBUG nova.compute.manager [req-4d8cd4d8-ce8f-4bed-baff-a90e6a7e389b req-5f3a63e1-116e-4b81-b349-6113494781af 
service nova] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Received event network-vif-plugged-011642af-0917-4a0f-a2fd-fd291a82d866 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 865.673873] env[61439]: DEBUG oslo_concurrency.lockutils [req-4d8cd4d8-ce8f-4bed-baff-a90e6a7e389b req-5f3a63e1-116e-4b81-b349-6113494781af service nova] Acquiring lock "7f0c1eef-750c-4d8f-8d90-a02898fdeee1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 865.674121] env[61439]: DEBUG oslo_concurrency.lockutils [req-4d8cd4d8-ce8f-4bed-baff-a90e6a7e389b req-5f3a63e1-116e-4b81-b349-6113494781af service nova] Lock "7f0c1eef-750c-4d8f-8d90-a02898fdeee1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 865.674311] env[61439]: DEBUG oslo_concurrency.lockutils [req-4d8cd4d8-ce8f-4bed-baff-a90e6a7e389b req-5f3a63e1-116e-4b81-b349-6113494781af service nova] Lock "7f0c1eef-750c-4d8f-8d90-a02898fdeee1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 865.674484] env[61439]: DEBUG nova.compute.manager [req-4d8cd4d8-ce8f-4bed-baff-a90e6a7e389b req-5f3a63e1-116e-4b81-b349-6113494781af service nova] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] No waiting events found dispatching network-vif-plugged-011642af-0917-4a0f-a2fd-fd291a82d866 {{(pid=61439) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 865.674647] env[61439]: WARNING nova.compute.manager [req-4d8cd4d8-ce8f-4bed-baff-a90e6a7e389b req-5f3a63e1-116e-4b81-b349-6113494781af service nova] [instance: 
7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Received unexpected event network-vif-plugged-011642af-0917-4a0f-a2fd-fd291a82d866 for instance with vm_state building and task_state spawning. [ 865.674801] env[61439]: DEBUG nova.compute.manager [req-4d8cd4d8-ce8f-4bed-baff-a90e6a7e389b req-5f3a63e1-116e-4b81-b349-6113494781af service nova] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Received event network-changed-011642af-0917-4a0f-a2fd-fd291a82d866 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 865.674949] env[61439]: DEBUG nova.compute.manager [req-4d8cd4d8-ce8f-4bed-baff-a90e6a7e389b req-5f3a63e1-116e-4b81-b349-6113494781af service nova] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Refreshing instance network info cache due to event network-changed-011642af-0917-4a0f-a2fd-fd291a82d866. {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 865.676120] env[61439]: DEBUG oslo_concurrency.lockutils [req-4d8cd4d8-ce8f-4bed-baff-a90e6a7e389b req-5f3a63e1-116e-4b81-b349-6113494781af service nova] Acquiring lock "refresh_cache-7f0c1eef-750c-4d8f-8d90-a02898fdeee1" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 865.676282] env[61439]: DEBUG oslo_concurrency.lockutils [req-4d8cd4d8-ce8f-4bed-baff-a90e6a7e389b req-5f3a63e1-116e-4b81-b349-6113494781af service nova] Acquired lock "refresh_cache-7f0c1eef-750c-4d8f-8d90-a02898fdeee1" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 865.679993] env[61439]: DEBUG nova.network.neutron [req-4d8cd4d8-ce8f-4bed-baff-a90e6a7e389b req-5f3a63e1-116e-4b81-b349-6113494781af service nova] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Refreshing network info cache for port 011642af-0917-4a0f-a2fd-fd291a82d866 {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 865.691800] env[61439]: DEBUG nova.compute.provider_tree 
[None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 865.704644] env[61439]: DEBUG nova.scheduler.client.report [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 865.723415] env[61439]: DEBUG oslo_concurrency.lockutils [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.364s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 865.723989] env[61439]: DEBUG nova.compute.manager [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Start building networks asynchronously for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 865.774023] env[61439]: DEBUG nova.compute.utils [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 865.774023] env[61439]: DEBUG nova.compute.manager [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 865.774023] env[61439]: DEBUG nova.network.neutron [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 865.786267] env[61439]: DEBUG nova.compute.manager [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 865.874121] env[61439]: DEBUG nova.compute.manager [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 865.895962] env[61439]: DEBUG nova.policy [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66924fb566fb455bada0c920137fb884', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3bab637ad53d433db6fb2017b6c0c2aa', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 865.909621] env[61439]: DEBUG nova.virt.hardware [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 865.909818] env[61439]: DEBUG nova.virt.hardware [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b 
tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 865.909976] env[61439]: DEBUG nova.virt.hardware [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 865.910379] env[61439]: DEBUG nova.virt.hardware [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 865.910542] env[61439]: DEBUG nova.virt.hardware [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 865.910695] env[61439]: DEBUG nova.virt.hardware [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 865.910964] env[61439]: DEBUG nova.virt.hardware [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 865.911183] env[61439]: DEBUG nova.virt.hardware [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 
tempest-ServersTestMultiNic-72568094-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 865.911400] env[61439]: DEBUG nova.virt.hardware [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 865.911579] env[61439]: DEBUG nova.virt.hardware [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 865.911757] env[61439]: DEBUG nova.virt.hardware [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 865.913036] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b0d8847-a519-4094-9af2-533765bb52f6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 865.924485] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f974ad9e-0139-46dc-a1cd-a9c3f815b3cf {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 866.016464] env[61439]: DEBUG nova.network.neutron [req-4d8cd4d8-ce8f-4bed-baff-a90e6a7e389b req-5f3a63e1-116e-4b81-b349-6113494781af service nova] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Updated VIF entry in instance network info cache for port 
011642af-0917-4a0f-a2fd-fd291a82d866. {{(pid=61439) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 866.017509] env[61439]: DEBUG nova.network.neutron [req-4d8cd4d8-ce8f-4bed-baff-a90e6a7e389b req-5f3a63e1-116e-4b81-b349-6113494781af service nova] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Updating instance_info_cache with network_info: [{"id": "011642af-0917-4a0f-a2fd-fd291a82d866", "address": "fa:16:3e:0a:27:d3", "network": {"id": "9f46b2ca-ebc5-4432-94b0-2ae9d64c35f2", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1638112611-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "473b8bea722142ecbc6aa9cc36556e7a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2d88bb07-f93c-45ca-bce7-230cb1f33833", "external-id": "nsx-vlan-transportzone-387", "segmentation_id": 387, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap011642af-09", "ovs_interfaceid": "011642af-0917-4a0f-a2fd-fd291a82d866", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 866.029360] env[61439]: DEBUG oslo_concurrency.lockutils [req-4d8cd4d8-ce8f-4bed-baff-a90e6a7e389b req-5f3a63e1-116e-4b81-b349-6113494781af service nova] Releasing lock "refresh_cache-7f0c1eef-750c-4d8f-8d90-a02898fdeee1" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 866.104634] env[61439]: DEBUG 
oslo_concurrency.lockutils [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 866.104900] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Processing image a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 866.105132] env[61439]: DEBUG oslo_concurrency.lockutils [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 866.602819] env[61439]: DEBUG nova.network.neutron [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Successfully created port: ba78beae-5d16-4843-9ed2-3e658e21c65f {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 866.941924] env[61439]: DEBUG nova.network.neutron [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Successfully created port: c7f69129-ef2e-4d1a-b868-d8791e855ea3 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 867.929633] env[61439]: 
DEBUG oslo_concurrency.lockutils [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Acquiring lock "2b55d3f3-cff9-4e34-936e-ece6759cfd40" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 867.929977] env[61439]: DEBUG oslo_concurrency.lockutils [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Lock "2b55d3f3-cff9-4e34-936e-ece6759cfd40" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 867.940531] env[61439]: DEBUG nova.compute.manager [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 867.991185] env[61439]: DEBUG oslo_concurrency.lockutils [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 867.991456] env[61439]: DEBUG oslo_concurrency.lockutils [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 867.994905] env[61439]: INFO nova.compute.claims [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 868.099612] env[61439]: DEBUG nova.network.neutron [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Successfully updated port: ba78beae-5d16-4843-9ed2-3e658e21c65f {{(pid=61439) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 868.123308] env[61439]: DEBUG nova.compute.manager [req-ac9cbd6f-aead-43bb-ad1e-53ebdd81b692 req-61eff90c-8260-43fa-b2ad-ede8db11640d service nova] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Received event network-vif-plugged-ba78beae-5d16-4843-9ed2-3e658e21c65f {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 
868.123308] env[61439]: DEBUG oslo_concurrency.lockutils [req-ac9cbd6f-aead-43bb-ad1e-53ebdd81b692 req-61eff90c-8260-43fa-b2ad-ede8db11640d service nova] Acquiring lock "b954e159-4d89-4c61-a5bc-5e6c67cf278c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 868.123308] env[61439]: DEBUG oslo_concurrency.lockutils [req-ac9cbd6f-aead-43bb-ad1e-53ebdd81b692 req-61eff90c-8260-43fa-b2ad-ede8db11640d service nova] Lock "b954e159-4d89-4c61-a5bc-5e6c67cf278c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 868.123308] env[61439]: DEBUG oslo_concurrency.lockutils [req-ac9cbd6f-aead-43bb-ad1e-53ebdd81b692 req-61eff90c-8260-43fa-b2ad-ede8db11640d service nova] Lock "b954e159-4d89-4c61-a5bc-5e6c67cf278c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 868.123420] env[61439]: DEBUG nova.compute.manager [req-ac9cbd6f-aead-43bb-ad1e-53ebdd81b692 req-61eff90c-8260-43fa-b2ad-ede8db11640d service nova] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] No waiting events found dispatching network-vif-plugged-ba78beae-5d16-4843-9ed2-3e658e21c65f {{(pid=61439) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 868.123420] env[61439]: WARNING nova.compute.manager [req-ac9cbd6f-aead-43bb-ad1e-53ebdd81b692 req-61eff90c-8260-43fa-b2ad-ede8db11640d service nova] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Received unexpected event network-vif-plugged-ba78beae-5d16-4843-9ed2-3e658e21c65f for instance with vm_state building and task_state spawning. 
[ 868.188172] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e57986c-3f95-4728-be86-f83ea43e6f55 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 868.195832] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e49b6b4-8b06-4881-9629-39b1172ec037 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 868.225615] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf21d6f9-8774-4b37-9ffa-2fa8ff73e9a6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 868.233162] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6681a479-1c94-48d7-9297-1d79aa970f7e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 868.246575] env[61439]: DEBUG nova.compute.provider_tree [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 868.255000] env[61439]: DEBUG nova.scheduler.client.report [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 868.271284] env[61439]: DEBUG oslo_concurrency.lockutils [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.279s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 868.271284] env[61439]: DEBUG nova.compute.manager [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 868.305413] env[61439]: DEBUG nova.compute.utils [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 868.306655] env[61439]: DEBUG nova.compute.manager [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 868.306912] env[61439]: DEBUG nova.network.neutron [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 868.316145] env[61439]: DEBUG nova.compute.manager [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 868.381962] env[61439]: DEBUG nova.policy [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'de248172b0ac4968bbf4e4195681ea0b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f6b7bee2debf4cf49c138ee34e161099', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 868.385544] env[61439]: DEBUG nova.compute.manager [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 868.408889] env[61439]: DEBUG nova.virt.hardware [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 868.409143] env[61439]: DEBUG nova.virt.hardware [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 868.409305] env[61439]: DEBUG nova.virt.hardware [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 868.409489] env[61439]: DEBUG nova.virt.hardware [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Flavor 
pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 868.409636] env[61439]: DEBUG nova.virt.hardware [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 868.409779] env[61439]: DEBUG nova.virt.hardware [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 868.410233] env[61439]: DEBUG nova.virt.hardware [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 868.410465] env[61439]: DEBUG nova.virt.hardware [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 868.410577] env[61439]: DEBUG nova.virt.hardware [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 868.410744] env[61439]: DEBUG nova.virt.hardware [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 
tempest-ListImageFiltersTestJSON-1130217809-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 868.410916] env[61439]: DEBUG nova.virt.hardware [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 868.411814] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61c362d0-d788-4766-a953-07b21be1da53 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 868.420199] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c011585b-dff4-4f43-a623-535276719926 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 868.700024] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Acquiring lock "aeeb7c6c-7413-46b0-8632-c7224620e9b2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 868.700291] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Lock "aeeb7c6c-7413-46b0-8632-c7224620e9b2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 868.717947] env[61439]: DEBUG nova.compute.manager [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 868.782012] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 868.782501] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 868.786936] env[61439]: INFO nova.compute.claims [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 868.984106] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34c754fd-2920-47be-a4cc-cbaa5bcc0988 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 868.989316] env[61439]: DEBUG nova.network.neutron [None req-325872bc-0f93-4c55-a545-7c70ed051360 
tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Successfully created port: c7a356ba-a6d2-40a9-aea1-2708c0862ac6 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 868.993962] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d21a1ec2-81c4-43de-b036-2fea65aefcb7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 869.027181] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96056990-12c6-4412-a633-b94e902d272c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 869.030355] env[61439]: DEBUG nova.network.neutron [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Successfully updated port: c7f69129-ef2e-4d1a-b868-d8791e855ea3 {{(pid=61439) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 869.037991] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bb6d2f1-5afd-4a9e-82b1-839f7f0f8094 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 869.044359] env[61439]: DEBUG oslo_concurrency.lockutils [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Acquiring lock "refresh_cache-b954e159-4d89-4c61-a5bc-5e6c67cf278c" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 869.044502] env[61439]: DEBUG oslo_concurrency.lockutils [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b 
tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Acquired lock "refresh_cache-b954e159-4d89-4c61-a5bc-5e6c67cf278c" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 869.044653] env[61439]: DEBUG nova.network.neutron [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 869.056113] env[61439]: DEBUG nova.compute.provider_tree [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 869.063533] env[61439]: DEBUG nova.scheduler.client.report [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 869.079384] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.297s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 869.079838] env[61439]: DEBUG nova.compute.manager [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 869.094180] env[61439]: DEBUG nova.network.neutron [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 869.117520] env[61439]: DEBUG nova.compute.utils [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 869.120885] env[61439]: DEBUG nova.compute.manager [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 869.120885] env[61439]: DEBUG nova.network.neutron [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 869.128998] env[61439]: DEBUG nova.compute.manager [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 869.193647] env[61439]: DEBUG nova.compute.manager [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 869.224179] env[61439]: DEBUG nova.virt.hardware [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 869.224422] env[61439]: DEBUG nova.virt.hardware [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 869.224579] env[61439]: DEBUG nova.virt.hardware [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 869.224806] env[61439]: DEBUG nova.virt.hardware [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Flavor 
pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 869.224962] env[61439]: DEBUG nova.virt.hardware [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 869.225125] env[61439]: DEBUG nova.virt.hardware [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 869.225338] env[61439]: DEBUG nova.virt.hardware [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 869.225557] env[61439]: DEBUG nova.virt.hardware [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 869.225665] env[61439]: DEBUG nova.virt.hardware [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 869.225826] env[61439]: DEBUG nova.virt.hardware [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 
tempest-ListImageFiltersTestJSON-1130217809-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 869.225995] env[61439]: DEBUG nova.virt.hardware [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 869.226907] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8304c590-f88a-4800-917a-77547e5406db {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 869.235295] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d620c1f-eb39-41c4-8a95-9985c65fb380 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 869.313833] env[61439]: DEBUG nova.policy [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'de248172b0ac4968bbf4e4195681ea0b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f6b7bee2debf4cf49c138ee34e161099', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 869.547111] env[61439]: DEBUG nova.compute.manager [req-d25088de-ac1c-4044-ad40-bf9d04c74de5 
req-4c2b6b92-7840-493b-ada6-486573f21f74 service nova] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Received event network-vif-plugged-c7a356ba-a6d2-40a9-aea1-2708c0862ac6 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 869.547111] env[61439]: DEBUG oslo_concurrency.lockutils [req-d25088de-ac1c-4044-ad40-bf9d04c74de5 req-4c2b6b92-7840-493b-ada6-486573f21f74 service nova] Acquiring lock "2b55d3f3-cff9-4e34-936e-ece6759cfd40-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 869.547307] env[61439]: DEBUG oslo_concurrency.lockutils [req-d25088de-ac1c-4044-ad40-bf9d04c74de5 req-4c2b6b92-7840-493b-ada6-486573f21f74 service nova] Lock "2b55d3f3-cff9-4e34-936e-ece6759cfd40-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 869.547494] env[61439]: DEBUG oslo_concurrency.lockutils [req-d25088de-ac1c-4044-ad40-bf9d04c74de5 req-4c2b6b92-7840-493b-ada6-486573f21f74 service nova] Lock "2b55d3f3-cff9-4e34-936e-ece6759cfd40-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 869.547663] env[61439]: DEBUG nova.compute.manager [req-d25088de-ac1c-4044-ad40-bf9d04c74de5 req-4c2b6b92-7840-493b-ada6-486573f21f74 service nova] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] No waiting events found dispatching network-vif-plugged-c7a356ba-a6d2-40a9-aea1-2708c0862ac6 {{(pid=61439) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 869.547838] env[61439]: WARNING nova.compute.manager [req-d25088de-ac1c-4044-ad40-bf9d04c74de5 req-4c2b6b92-7840-493b-ada6-486573f21f74 service 
nova] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Received unexpected event network-vif-plugged-c7a356ba-a6d2-40a9-aea1-2708c0862ac6 for instance with vm_state building and task_state spawning. [ 869.591349] env[61439]: DEBUG nova.network.neutron [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Successfully updated port: c7a356ba-a6d2-40a9-aea1-2708c0862ac6 {{(pid=61439) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 869.610995] env[61439]: DEBUG oslo_concurrency.lockutils [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Acquiring lock "refresh_cache-2b55d3f3-cff9-4e34-936e-ece6759cfd40" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 869.611184] env[61439]: DEBUG oslo_concurrency.lockutils [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Acquired lock "refresh_cache-2b55d3f3-cff9-4e34-936e-ece6759cfd40" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 869.611360] env[61439]: DEBUG nova.network.neutron [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 869.661651] env[61439]: DEBUG nova.network.neutron [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Instance cache 
missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 869.818861] env[61439]: DEBUG nova.network.neutron [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Updating instance_info_cache with network_info: [{"id": "c7a356ba-a6d2-40a9-aea1-2708c0862ac6", "address": "fa:16:3e:0e:e6:be", "network": {"id": "e740c654-c12e-49cb-af60-ccd4008d5a05", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d8f76046251a4a44a275999df0a57832", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69e41c97-4d75-4041-ae71-321e7e9d480b", "external-id": "nsx-vlan-transportzone-483", "segmentation_id": 483, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc7a356ba-a6", "ovs_interfaceid": "c7a356ba-a6d2-40a9-aea1-2708c0862ac6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 869.842525] env[61439]: DEBUG oslo_concurrency.lockutils [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Releasing lock "refresh_cache-2b55d3f3-cff9-4e34-936e-ece6759cfd40" {{(pid=61439) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 869.842525] env[61439]: DEBUG nova.compute.manager [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Instance network_info: |[{"id": "c7a356ba-a6d2-40a9-aea1-2708c0862ac6", "address": "fa:16:3e:0e:e6:be", "network": {"id": "e740c654-c12e-49cb-af60-ccd4008d5a05", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d8f76046251a4a44a275999df0a57832", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69e41c97-4d75-4041-ae71-321e7e9d480b", "external-id": "nsx-vlan-transportzone-483", "segmentation_id": 483, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc7a356ba-a6", "ovs_interfaceid": "c7a356ba-a6d2-40a9-aea1-2708c0862ac6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 869.842735] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0e:e6:be', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 
'69e41c97-4d75-4041-ae71-321e7e9d480b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c7a356ba-a6d2-40a9-aea1-2708c0862ac6', 'vif_model': 'vmxnet3'}] {{(pid=61439) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 869.849809] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Creating folder: Project (f6b7bee2debf4cf49c138ee34e161099). Parent ref: group-v221281. {{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 869.850455] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dfa337ef-bbb0-4d4d-a0bd-3bd2f9635b31 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 869.861938] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Created folder: Project (f6b7bee2debf4cf49c138ee34e161099) in parent group-v221281. [ 869.862140] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Creating folder: Instances. Parent ref: group-v221321. 
{{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 869.862424] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-95a01d4e-3b18-4da8-bc0a-1738359f92f1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 869.871556] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Created folder: Instances in parent group-v221321. [ 869.871779] env[61439]: DEBUG oslo.service.loopingcall [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 869.871955] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Creating VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 869.872160] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-9cd5a6f5-dc84-463f-ab3e-a49c9ae2bb18 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 869.892020] env[61439]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 869.892020] env[61439]: value = "task-987706" [ 869.892020] env[61439]: _type = "Task" [ 869.892020] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 869.899478] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987706, 'name': CreateVM_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 869.942849] env[61439]: DEBUG nova.network.neutron [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Updating instance_info_cache with network_info: [{"id": "ba78beae-5d16-4843-9ed2-3e658e21c65f", "address": "fa:16:3e:f8:c3:7d", "network": {"id": "5042bd97-1b7d-4618-8179-f67f371d6dc7", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1532647065", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3bab637ad53d433db6fb2017b6c0c2aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "60badc2d-69d2-467d-a92e-98511f5cb0b2", "external-id": "cl2-zone-408", "segmentation_id": 408, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapba78beae-5d", "ovs_interfaceid": "ba78beae-5d16-4843-9ed2-3e658e21c65f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c7f69129-ef2e-4d1a-b868-d8791e855ea3", "address": "fa:16:3e:4d:50:4d", "network": {"id": "59bc9e17-a135-4870-8c79-21c0f675bf16", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1537880862", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.15", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "3bab637ad53d433db6fb2017b6c0c2aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e31264e2-3e0a-4dfb-ba1f-6389d7d47548", "external-id": "nsx-vlan-transportzone-233", "segmentation_id": 233, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc7f69129-ef", "ovs_interfaceid": "c7f69129-ef2e-4d1a-b868-d8791e855ea3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 869.954205] env[61439]: DEBUG oslo_concurrency.lockutils [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Releasing lock "refresh_cache-b954e159-4d89-4c61-a5bc-5e6c67cf278c" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 869.954484] env[61439]: DEBUG nova.compute.manager [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Instance network_info: |[{"id": "ba78beae-5d16-4843-9ed2-3e658e21c65f", "address": "fa:16:3e:f8:c3:7d", "network": {"id": "5042bd97-1b7d-4618-8179-f67f371d6dc7", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1532647065", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": 
{"injected": false, "tenant_id": "3bab637ad53d433db6fb2017b6c0c2aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "60badc2d-69d2-467d-a92e-98511f5cb0b2", "external-id": "cl2-zone-408", "segmentation_id": 408, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapba78beae-5d", "ovs_interfaceid": "ba78beae-5d16-4843-9ed2-3e658e21c65f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c7f69129-ef2e-4d1a-b868-d8791e855ea3", "address": "fa:16:3e:4d:50:4d", "network": {"id": "59bc9e17-a135-4870-8c79-21c0f675bf16", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1537880862", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.15", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "3bab637ad53d433db6fb2017b6c0c2aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e31264e2-3e0a-4dfb-ba1f-6389d7d47548", "external-id": "nsx-vlan-transportzone-233", "segmentation_id": 233, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc7f69129-ef", "ovs_interfaceid": "c7f69129-ef2e-4d1a-b868-d8791e855ea3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 869.955031] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b 
tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f8:c3:7d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '60badc2d-69d2-467d-a92e-98511f5cb0b2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ba78beae-5d16-4843-9ed2-3e658e21c65f', 'vif_model': 'vmxnet3'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:4d:50:4d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'e31264e2-3e0a-4dfb-ba1f-6389d7d47548', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c7f69129-ef2e-4d1a-b868-d8791e855ea3', 'vif_model': 'vmxnet3'}] {{(pid=61439) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 869.964371] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Creating folder: Project (3bab637ad53d433db6fb2017b6c0c2aa). Parent ref: group-v221281. {{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 869.964959] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d9fae28e-f626-423d-aba6-fd04706c668e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 869.975500] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Created folder: Project (3bab637ad53d433db6fb2017b6c0c2aa) in parent group-v221281. [ 869.975714] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Creating folder: Instances. Parent ref: group-v221324. 
{{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 869.975926] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c4749b68-dc3b-4ce1-a7a8-a16d6b5ea626 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 869.986609] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Created folder: Instances in parent group-v221324. [ 869.986885] env[61439]: DEBUG oslo.service.loopingcall [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 869.987035] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Creating VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 869.987246] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0b56be48-0cfd-4e91-85c3-4501035930c6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 870.009526] env[61439]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 870.009526] env[61439]: value = "task-987709" [ 870.009526] env[61439]: _type = "Task" [ 870.009526] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 870.017127] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987709, 'name': CreateVM_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 870.042331] env[61439]: DEBUG nova.network.neutron [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Successfully created port: e17d8ca6-05f9-4753-b2b4-d677044848ba {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 870.153037] env[61439]: DEBUG nova.compute.manager [req-59bbaae8-65ae-4d70-a7b5-603cf67a3473 req-a53a8527-9337-4ce2-b73f-9e40360b2ddd service nova] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Received event network-changed-ba78beae-5d16-4843-9ed2-3e658e21c65f {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 870.153037] env[61439]: DEBUG nova.compute.manager [req-59bbaae8-65ae-4d70-a7b5-603cf67a3473 req-a53a8527-9337-4ce2-b73f-9e40360b2ddd service nova] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Refreshing instance network info cache due to event network-changed-ba78beae-5d16-4843-9ed2-3e658e21c65f. 
{{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 870.153037] env[61439]: DEBUG oslo_concurrency.lockutils [req-59bbaae8-65ae-4d70-a7b5-603cf67a3473 req-a53a8527-9337-4ce2-b73f-9e40360b2ddd service nova] Acquiring lock "refresh_cache-b954e159-4d89-4c61-a5bc-5e6c67cf278c" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 870.153037] env[61439]: DEBUG oslo_concurrency.lockutils [req-59bbaae8-65ae-4d70-a7b5-603cf67a3473 req-a53a8527-9337-4ce2-b73f-9e40360b2ddd service nova] Acquired lock "refresh_cache-b954e159-4d89-4c61-a5bc-5e6c67cf278c" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 870.153037] env[61439]: DEBUG nova.network.neutron [req-59bbaae8-65ae-4d70-a7b5-603cf67a3473 req-a53a8527-9337-4ce2-b73f-9e40360b2ddd service nova] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Refreshing network info cache for port ba78beae-5d16-4843-9ed2-3e658e21c65f {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 870.402614] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987706, 'name': CreateVM_Task, 'duration_secs': 0.490081} completed successfully. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 870.402803] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Created VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 870.403754] env[61439]: DEBUG oslo_concurrency.lockutils [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 870.403928] env[61439]: DEBUG oslo_concurrency.lockutils [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 870.404272] env[61439]: DEBUG oslo_concurrency.lockutils [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 870.404535] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4aec433a-6c42-461d-b3db-bc1be05eebd6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 870.409691] env[61439]: DEBUG oslo_vmware.api [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 
tempest-ListImageFiltersTestJSON-1130217809-project-member] Waiting for the task: (returnval){ [ 870.409691] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52d20651-0343-66c5-3727-ab2a413e2eeb" [ 870.409691] env[61439]: _type = "Task" [ 870.409691] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 870.417641] env[61439]: DEBUG oslo_vmware.api [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52d20651-0343-66c5-3727-ab2a413e2eeb, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 870.519031] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987709, 'name': CreateVM_Task, 'duration_secs': 0.506363} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 870.519203] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Created VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 870.519914] env[61439]: DEBUG oslo_concurrency.lockutils [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 870.860174] env[61439]: DEBUG nova.network.neutron [req-59bbaae8-65ae-4d70-a7b5-603cf67a3473 req-a53a8527-9337-4ce2-b73f-9e40360b2ddd service nova] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Updated VIF entry in instance network info cache for port ba78beae-5d16-4843-9ed2-3e658e21c65f. 
{{(pid=61439) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 870.860655] env[61439]: DEBUG nova.network.neutron [req-59bbaae8-65ae-4d70-a7b5-603cf67a3473 req-a53a8527-9337-4ce2-b73f-9e40360b2ddd service nova] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Updating instance_info_cache with network_info: [{"id": "ba78beae-5d16-4843-9ed2-3e658e21c65f", "address": "fa:16:3e:f8:c3:7d", "network": {"id": "5042bd97-1b7d-4618-8179-f67f371d6dc7", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1532647065", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3bab637ad53d433db6fb2017b6c0c2aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "60badc2d-69d2-467d-a92e-98511f5cb0b2", "external-id": "cl2-zone-408", "segmentation_id": 408, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapba78beae-5d", "ovs_interfaceid": "ba78beae-5d16-4843-9ed2-3e658e21c65f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c7f69129-ef2e-4d1a-b868-d8791e855ea3", "address": "fa:16:3e:4d:50:4d", "network": {"id": "59bc9e17-a135-4870-8c79-21c0f675bf16", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1537880862", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.15", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, 
"dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "3bab637ad53d433db6fb2017b6c0c2aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e31264e2-3e0a-4dfb-ba1f-6389d7d47548", "external-id": "nsx-vlan-transportzone-233", "segmentation_id": 233, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc7f69129-ef", "ovs_interfaceid": "c7f69129-ef2e-4d1a-b868-d8791e855ea3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 870.870853] env[61439]: DEBUG oslo_concurrency.lockutils [req-59bbaae8-65ae-4d70-a7b5-603cf67a3473 req-a53a8527-9337-4ce2-b73f-9e40360b2ddd service nova] Releasing lock "refresh_cache-b954e159-4d89-4c61-a5bc-5e6c67cf278c" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 870.871246] env[61439]: DEBUG nova.compute.manager [req-59bbaae8-65ae-4d70-a7b5-603cf67a3473 req-a53a8527-9337-4ce2-b73f-9e40360b2ddd service nova] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Received event network-vif-plugged-c7f69129-ef2e-4d1a-b868-d8791e855ea3 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 870.871513] env[61439]: DEBUG oslo_concurrency.lockutils [req-59bbaae8-65ae-4d70-a7b5-603cf67a3473 req-a53a8527-9337-4ce2-b73f-9e40360b2ddd service nova] Acquiring lock "b954e159-4d89-4c61-a5bc-5e6c67cf278c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 870.871778] env[61439]: DEBUG oslo_concurrency.lockutils [req-59bbaae8-65ae-4d70-a7b5-603cf67a3473 req-a53a8527-9337-4ce2-b73f-9e40360b2ddd service nova] 
Lock "b954e159-4d89-4c61-a5bc-5e6c67cf278c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 870.872007] env[61439]: DEBUG oslo_concurrency.lockutils [req-59bbaae8-65ae-4d70-a7b5-603cf67a3473 req-a53a8527-9337-4ce2-b73f-9e40360b2ddd service nova] Lock "b954e159-4d89-4c61-a5bc-5e6c67cf278c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 870.872240] env[61439]: DEBUG nova.compute.manager [req-59bbaae8-65ae-4d70-a7b5-603cf67a3473 req-a53a8527-9337-4ce2-b73f-9e40360b2ddd service nova] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] No waiting events found dispatching network-vif-plugged-c7f69129-ef2e-4d1a-b868-d8791e855ea3 {{(pid=61439) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 870.872540] env[61439]: WARNING nova.compute.manager [req-59bbaae8-65ae-4d70-a7b5-603cf67a3473 req-a53a8527-9337-4ce2-b73f-9e40360b2ddd service nova] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Received unexpected event network-vif-plugged-c7f69129-ef2e-4d1a-b868-d8791e855ea3 for instance with vm_state building and task_state spawning. 
[ 870.873103] env[61439]: DEBUG nova.compute.manager [req-59bbaae8-65ae-4d70-a7b5-603cf67a3473 req-a53a8527-9337-4ce2-b73f-9e40360b2ddd service nova] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Received event network-changed-c7f69129-ef2e-4d1a-b868-d8791e855ea3 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 870.873288] env[61439]: DEBUG nova.compute.manager [req-59bbaae8-65ae-4d70-a7b5-603cf67a3473 req-a53a8527-9337-4ce2-b73f-9e40360b2ddd service nova] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Refreshing instance network info cache due to event network-changed-c7f69129-ef2e-4d1a-b868-d8791e855ea3. {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 870.873542] env[61439]: DEBUG oslo_concurrency.lockutils [req-59bbaae8-65ae-4d70-a7b5-603cf67a3473 req-a53a8527-9337-4ce2-b73f-9e40360b2ddd service nova] Acquiring lock "refresh_cache-b954e159-4d89-4c61-a5bc-5e6c67cf278c" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 870.873745] env[61439]: DEBUG oslo_concurrency.lockutils [req-59bbaae8-65ae-4d70-a7b5-603cf67a3473 req-a53a8527-9337-4ce2-b73f-9e40360b2ddd service nova] Acquired lock "refresh_cache-b954e159-4d89-4c61-a5bc-5e6c67cf278c" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 870.873921] env[61439]: DEBUG nova.network.neutron [req-59bbaae8-65ae-4d70-a7b5-603cf67a3473 req-a53a8527-9337-4ce2-b73f-9e40360b2ddd service nova] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Refreshing network info cache for port c7f69129-ef2e-4d1a-b868-d8791e855ea3 {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 870.919894] env[61439]: DEBUG oslo_concurrency.lockutils [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Releasing lock "[datastore2] 
devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 870.920158] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Processing image a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 870.920378] env[61439]: DEBUG oslo_concurrency.lockutils [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 870.920594] env[61439]: DEBUG oslo_concurrency.lockutils [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 870.920893] env[61439]: DEBUG oslo_concurrency.lockutils [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 870.921162] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a51d5c9a-2a07-44f8-8ed1-d36d21ed0361 {{(pid=61439) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 870.926128] env[61439]: DEBUG oslo_vmware.api [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Waiting for the task: (returnval){ [ 870.926128] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]522694d4-ff2d-19f9-6cdb-53b8239e8e0d" [ 870.926128] env[61439]: _type = "Task" [ 870.926128] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 870.934088] env[61439]: DEBUG oslo_vmware.api [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]522694d4-ff2d-19f9-6cdb-53b8239e8e0d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 871.435587] env[61439]: DEBUG oslo_concurrency.lockutils [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 871.435912] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Processing image a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 871.436197] env[61439]: DEBUG oslo_concurrency.lockutils [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 
tempest-ServersTestMultiNic-72568094-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 871.525468] env[61439]: DEBUG nova.network.neutron [req-59bbaae8-65ae-4d70-a7b5-603cf67a3473 req-a53a8527-9337-4ce2-b73f-9e40360b2ddd service nova] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Updated VIF entry in instance network info cache for port c7f69129-ef2e-4d1a-b868-d8791e855ea3. {{(pid=61439) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 871.525898] env[61439]: DEBUG nova.network.neutron [req-59bbaae8-65ae-4d70-a7b5-603cf67a3473 req-a53a8527-9337-4ce2-b73f-9e40360b2ddd service nova] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Updating instance_info_cache with network_info: [{"id": "ba78beae-5d16-4843-9ed2-3e658e21c65f", "address": "fa:16:3e:f8:c3:7d", "network": {"id": "5042bd97-1b7d-4618-8179-f67f371d6dc7", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1532647065", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3bab637ad53d433db6fb2017b6c0c2aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "60badc2d-69d2-467d-a92e-98511f5cb0b2", "external-id": "cl2-zone-408", "segmentation_id": 408, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapba78beae-5d", "ovs_interfaceid": "ba78beae-5d16-4843-9ed2-3e658e21c65f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c7f69129-ef2e-4d1a-b868-d8791e855ea3", "address": "fa:16:3e:4d:50:4d", "network": {"id": "59bc9e17-a135-4870-8c79-21c0f675bf16", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1537880862", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.15", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "3bab637ad53d433db6fb2017b6c0c2aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e31264e2-3e0a-4dfb-ba1f-6389d7d47548", "external-id": "nsx-vlan-transportzone-233", "segmentation_id": 233, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc7f69129-ef", "ovs_interfaceid": "c7f69129-ef2e-4d1a-b868-d8791e855ea3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 871.535996] env[61439]: DEBUG oslo_concurrency.lockutils [req-59bbaae8-65ae-4d70-a7b5-603cf67a3473 req-a53a8527-9337-4ce2-b73f-9e40360b2ddd service nova] Releasing lock "refresh_cache-b954e159-4d89-4c61-a5bc-5e6c67cf278c" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 871.549017] env[61439]: DEBUG nova.network.neutron [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Successfully updated port: e17d8ca6-05f9-4753-b2b4-d677044848ba 
{{(pid=61439) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 871.563784] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Acquiring lock "refresh_cache-aeeb7c6c-7413-46b0-8632-c7224620e9b2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 871.563974] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Acquired lock "refresh_cache-aeeb7c6c-7413-46b0-8632-c7224620e9b2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 871.564761] env[61439]: DEBUG nova.network.neutron [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 871.576980] env[61439]: DEBUG nova.compute.manager [req-f206cc87-1dd2-4ce2-9e05-ac82ecaec35f req-ef89ee91-7a73-486a-8991-6e7938223584 service nova] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Received event network-changed-c7a356ba-a6d2-40a9-aea1-2708c0862ac6 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 871.577191] env[61439]: DEBUG nova.compute.manager [req-f206cc87-1dd2-4ce2-9e05-ac82ecaec35f req-ef89ee91-7a73-486a-8991-6e7938223584 service nova] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Refreshing instance network info cache due to event network-changed-c7a356ba-a6d2-40a9-aea1-2708c0862ac6. 
{{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 871.577399] env[61439]: DEBUG oslo_concurrency.lockutils [req-f206cc87-1dd2-4ce2-9e05-ac82ecaec35f req-ef89ee91-7a73-486a-8991-6e7938223584 service nova] Acquiring lock "refresh_cache-2b55d3f3-cff9-4e34-936e-ece6759cfd40" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 871.577545] env[61439]: DEBUG oslo_concurrency.lockutils [req-f206cc87-1dd2-4ce2-9e05-ac82ecaec35f req-ef89ee91-7a73-486a-8991-6e7938223584 service nova] Acquired lock "refresh_cache-2b55d3f3-cff9-4e34-936e-ece6759cfd40" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 871.577758] env[61439]: DEBUG nova.network.neutron [req-f206cc87-1dd2-4ce2-9e05-ac82ecaec35f req-ef89ee91-7a73-486a-8991-6e7938223584 service nova] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Refreshing network info cache for port c7a356ba-a6d2-40a9-aea1-2708c0862ac6 {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 871.618008] env[61439]: DEBUG nova.network.neutron [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 872.216742] env[61439]: DEBUG nova.network.neutron [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Updating instance_info_cache with network_info: [{"id": "e17d8ca6-05f9-4753-b2b4-d677044848ba", "address": "fa:16:3e:1a:a3:80", "network": {"id": "e740c654-c12e-49cb-af60-ccd4008d5a05", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.98", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d8f76046251a4a44a275999df0a57832", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69e41c97-4d75-4041-ae71-321e7e9d480b", "external-id": "nsx-vlan-transportzone-483", "segmentation_id": 483, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape17d8ca6-05", "ovs_interfaceid": "e17d8ca6-05f9-4753-b2b4-d677044848ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 872.228976] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Releasing lock "refresh_cache-aeeb7c6c-7413-46b0-8632-c7224620e9b2" {{(pid=61439) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 872.229295] env[61439]: DEBUG nova.compute.manager [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Instance network_info: |[{"id": "e17d8ca6-05f9-4753-b2b4-d677044848ba", "address": "fa:16:3e:1a:a3:80", "network": {"id": "e740c654-c12e-49cb-af60-ccd4008d5a05", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.98", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d8f76046251a4a44a275999df0a57832", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69e41c97-4d75-4041-ae71-321e7e9d480b", "external-id": "nsx-vlan-transportzone-483", "segmentation_id": 483, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape17d8ca6-05", "ovs_interfaceid": "e17d8ca6-05f9-4753-b2b4-d677044848ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 872.229676] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:1a:a3:80', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 
'69e41c97-4d75-4041-ae71-321e7e9d480b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e17d8ca6-05f9-4753-b2b4-d677044848ba', 'vif_model': 'vmxnet3'}] {{(pid=61439) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 872.244347] env[61439]: DEBUG oslo.service.loopingcall [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 872.244893] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Creating VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 872.245184] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f2db2d81-a9e6-4228-860a-08e45b99cb50 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 872.265093] env[61439]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 872.265093] env[61439]: value = "task-987710" [ 872.265093] env[61439]: _type = "Task" [ 872.265093] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 872.272985] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987710, 'name': CreateVM_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 872.286634] env[61439]: DEBUG nova.network.neutron [req-f206cc87-1dd2-4ce2-9e05-ac82ecaec35f req-ef89ee91-7a73-486a-8991-6e7938223584 service nova] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Updated VIF entry in instance network info cache for port c7a356ba-a6d2-40a9-aea1-2708c0862ac6. 
{{(pid=61439) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 872.286925] env[61439]: DEBUG nova.network.neutron [req-f206cc87-1dd2-4ce2-9e05-ac82ecaec35f req-ef89ee91-7a73-486a-8991-6e7938223584 service nova] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Updating instance_info_cache with network_info: [{"id": "c7a356ba-a6d2-40a9-aea1-2708c0862ac6", "address": "fa:16:3e:0e:e6:be", "network": {"id": "e740c654-c12e-49cb-af60-ccd4008d5a05", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d8f76046251a4a44a275999df0a57832", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69e41c97-4d75-4041-ae71-321e7e9d480b", "external-id": "nsx-vlan-transportzone-483", "segmentation_id": 483, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc7a356ba-a6", "ovs_interfaceid": "c7a356ba-a6d2-40a9-aea1-2708c0862ac6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 872.296167] env[61439]: DEBUG oslo_concurrency.lockutils [req-f206cc87-1dd2-4ce2-9e05-ac82ecaec35f req-ef89ee91-7a73-486a-8991-6e7938223584 service nova] Releasing lock "refresh_cache-2b55d3f3-cff9-4e34-936e-ece6759cfd40" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 872.414678] env[61439]: DEBUG nova.compute.manager [req-ac45d394-16da-4b16-9898-5d4a37fa0826 
req-46c6b4c7-c858-429c-8c14-212c7f4eaf12 service nova] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Received event network-vif-plugged-e17d8ca6-05f9-4753-b2b4-d677044848ba {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 872.414892] env[61439]: DEBUG oslo_concurrency.lockutils [req-ac45d394-16da-4b16-9898-5d4a37fa0826 req-46c6b4c7-c858-429c-8c14-212c7f4eaf12 service nova] Acquiring lock "aeeb7c6c-7413-46b0-8632-c7224620e9b2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 872.415170] env[61439]: DEBUG oslo_concurrency.lockutils [req-ac45d394-16da-4b16-9898-5d4a37fa0826 req-46c6b4c7-c858-429c-8c14-212c7f4eaf12 service nova] Lock "aeeb7c6c-7413-46b0-8632-c7224620e9b2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 872.415321] env[61439]: DEBUG oslo_concurrency.lockutils [req-ac45d394-16da-4b16-9898-5d4a37fa0826 req-46c6b4c7-c858-429c-8c14-212c7f4eaf12 service nova] Lock "aeeb7c6c-7413-46b0-8632-c7224620e9b2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 872.415491] env[61439]: DEBUG nova.compute.manager [req-ac45d394-16da-4b16-9898-5d4a37fa0826 req-46c6b4c7-c858-429c-8c14-212c7f4eaf12 service nova] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] No waiting events found dispatching network-vif-plugged-e17d8ca6-05f9-4753-b2b4-d677044848ba {{(pid=61439) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 872.415668] env[61439]: WARNING nova.compute.manager [req-ac45d394-16da-4b16-9898-5d4a37fa0826 req-46c6b4c7-c858-429c-8c14-212c7f4eaf12 service 
nova] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Received unexpected event network-vif-plugged-e17d8ca6-05f9-4753-b2b4-d677044848ba for instance with vm_state building and task_state spawning. [ 872.415856] env[61439]: DEBUG nova.compute.manager [req-ac45d394-16da-4b16-9898-5d4a37fa0826 req-46c6b4c7-c858-429c-8c14-212c7f4eaf12 service nova] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Received event network-changed-e17d8ca6-05f9-4753-b2b4-d677044848ba {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 872.416018] env[61439]: DEBUG nova.compute.manager [req-ac45d394-16da-4b16-9898-5d4a37fa0826 req-46c6b4c7-c858-429c-8c14-212c7f4eaf12 service nova] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Refreshing instance network info cache due to event network-changed-e17d8ca6-05f9-4753-b2b4-d677044848ba. {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 872.416226] env[61439]: DEBUG oslo_concurrency.lockutils [req-ac45d394-16da-4b16-9898-5d4a37fa0826 req-46c6b4c7-c858-429c-8c14-212c7f4eaf12 service nova] Acquiring lock "refresh_cache-aeeb7c6c-7413-46b0-8632-c7224620e9b2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 872.416380] env[61439]: DEBUG oslo_concurrency.lockutils [req-ac45d394-16da-4b16-9898-5d4a37fa0826 req-46c6b4c7-c858-429c-8c14-212c7f4eaf12 service nova] Acquired lock "refresh_cache-aeeb7c6c-7413-46b0-8632-c7224620e9b2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 872.416557] env[61439]: DEBUG nova.network.neutron [req-ac45d394-16da-4b16-9898-5d4a37fa0826 req-46c6b4c7-c858-429c-8c14-212c7f4eaf12 service nova] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Refreshing network info cache for port e17d8ca6-05f9-4753-b2b4-d677044848ba {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 872.756443] env[61439]: DEBUG 
nova.network.neutron [req-ac45d394-16da-4b16-9898-5d4a37fa0826 req-46c6b4c7-c858-429c-8c14-212c7f4eaf12 service nova] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Updated VIF entry in instance network info cache for port e17d8ca6-05f9-4753-b2b4-d677044848ba. {{(pid=61439) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 872.756813] env[61439]: DEBUG nova.network.neutron [req-ac45d394-16da-4b16-9898-5d4a37fa0826 req-46c6b4c7-c858-429c-8c14-212c7f4eaf12 service nova] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Updating instance_info_cache with network_info: [{"id": "e17d8ca6-05f9-4753-b2b4-d677044848ba", "address": "fa:16:3e:1a:a3:80", "network": {"id": "e740c654-c12e-49cb-af60-ccd4008d5a05", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.98", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d8f76046251a4a44a275999df0a57832", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69e41c97-4d75-4041-ae71-321e7e9d480b", "external-id": "nsx-vlan-transportzone-483", "segmentation_id": 483, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape17d8ca6-05", "ovs_interfaceid": "e17d8ca6-05f9-4753-b2b4-d677044848ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 872.766137] env[61439]: DEBUG oslo_concurrency.lockutils [req-ac45d394-16da-4b16-9898-5d4a37fa0826 req-46c6b4c7-c858-429c-8c14-212c7f4eaf12 service nova] Releasing 
lock "refresh_cache-aeeb7c6c-7413-46b0-8632-c7224620e9b2" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 872.775892] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987710, 'name': CreateVM_Task, 'duration_secs': 0.291007} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 872.776068] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Created VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 872.776672] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 872.776837] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 872.777148] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 872.777388] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with 
opID=oslo.vmware-fe6fd239-bbc9-4c8e-9bca-17b23c838162 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 872.781658] env[61439]: DEBUG oslo_vmware.api [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Waiting for the task: (returnval){ [ 872.781658] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]5225f4a3-2af4-d7aa-dbf8-be5742461fd9" [ 872.781658] env[61439]: _type = "Task" [ 872.781658] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 872.789316] env[61439]: DEBUG oslo_vmware.api [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]5225f4a3-2af4-d7aa-dbf8-be5742461fd9, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 873.292905] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 873.293018] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Processing image a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 873.294030] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 882.316092] env[61439]: WARNING oslo_vmware.rw_handles [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 882.316092] env[61439]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 882.316092] env[61439]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 882.316092] env[61439]: ERROR oslo_vmware.rw_handles 
self._conn.getresponse() [ 882.316092] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 882.316092] env[61439]: ERROR oslo_vmware.rw_handles response.begin() [ 882.316092] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 882.316092] env[61439]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 882.316092] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 882.316092] env[61439]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 882.316092] env[61439]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 882.316092] env[61439]: ERROR oslo_vmware.rw_handles [ 882.316697] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Downloaded image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to vmware_temp/aaf80575-1139-4dee-8576-34bb69d67fce/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 882.318637] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Caching image {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 882.318901] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Copying Virtual Disk [datastore2] 
vmware_temp/aaf80575-1139-4dee-8576-34bb69d67fce/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk to [datastore2] vmware_temp/aaf80575-1139-4dee-8576-34bb69d67fce/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk {{(pid=61439) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 882.319189] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-380dc6fa-878c-443c-b426-6a98976e310d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 882.326341] env[61439]: DEBUG oslo_vmware.api [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Waiting for the task: (returnval){ [ 882.326341] env[61439]: value = "task-987711" [ 882.326341] env[61439]: _type = "Task" [ 882.326341] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 882.335346] env[61439]: DEBUG oslo_vmware.api [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Task: {'id': task-987711, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 882.836681] env[61439]: DEBUG oslo_vmware.exceptions [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Fault InvalidArgument not matched. 
{{(pid=61439) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 882.836972] env[61439]: DEBUG oslo_concurrency.lockutils [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 882.837538] env[61439]: ERROR nova.compute.manager [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 882.837538] env[61439]: Faults: ['InvalidArgument'] [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] Traceback (most recent call last): [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] yield resources [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] self.driver.spawn(context, instance, image_meta, [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 
42ca8a89-5938-491b-b122-deac71d18505] self._vmops.spawn(context, instance, image_meta, injected_files, [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] self._fetch_image_if_missing(context, vi) [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] image_cache(vi, tmp_image_ds_loc) [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] vm_util.copy_virtual_disk( [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] session._wait_for_task(vmdk_copy_task) [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] return self.wait_for_task(task_ref) [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 
42ca8a89-5938-491b-b122-deac71d18505] return evt.wait() [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] result = hub.switch() [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] return self.greenlet.switch() [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] self.f(*self.args, **self.kw) [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] raise exceptions.translate_fault(task_info.error) [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] Faults: ['InvalidArgument'] [ 882.837538] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] [ 882.838316] env[61439]: INFO nova.compute.manager [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 
tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Terminating instance [ 882.839348] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 882.839580] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 882.839832] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ad1d93e9-4f88-4933-8c1a-3b0ab2d0cd37 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 882.842593] env[61439]: DEBUG oslo_concurrency.lockutils [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Acquiring lock "refresh_cache-42ca8a89-5938-491b-b122-deac71d18505" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 882.842757] env[61439]: DEBUG oslo_concurrency.lockutils [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Acquired lock "refresh_cache-42ca8a89-5938-491b-b122-deac71d18505" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 882.842924] env[61439]: DEBUG nova.network.neutron [None 
req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 882.850152] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 882.850334] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=61439) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 882.851062] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c0c42239-65b0-4410-95a1-14cd3ef8741b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 882.862577] env[61439]: DEBUG oslo_vmware.api [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Waiting for the task: (returnval){ [ 882.862577] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52d409f2-7b49-581e-b205-4c12051759c0" [ 882.862577] env[61439]: _type = "Task" [ 882.862577] env[61439]: } to complete. 
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 882.869493] env[61439]: DEBUG oslo_vmware.api [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52d409f2-7b49-581e-b205-4c12051759c0, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 882.879361] env[61439]: DEBUG nova.network.neutron [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 883.009450] env[61439]: DEBUG nova.network.neutron [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 883.017990] env[61439]: DEBUG oslo_concurrency.lockutils [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Releasing lock "refresh_cache-42ca8a89-5938-491b-b122-deac71d18505" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 883.018401] env[61439]: DEBUG nova.compute.manager [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 883.018594] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 883.019770] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f0c67f8-c598-4437-831a-7b312c8e15cd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 883.027949] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Unregistering the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 883.028193] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d426d3ad-ea96-419e-a450-4b5309c94e94 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 883.061245] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Unregistered the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 883.061465] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Deleting contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 883.061649] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Deleting the datastore file [datastore2] 42ca8a89-5938-491b-b122-deac71d18505 {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 883.061907] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a8157426-ef77-430f-b61e-7a80eadc8359 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 883.068345] env[61439]: DEBUG oslo_vmware.api [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Waiting for the task: (returnval){ [ 883.068345] env[61439]: value = "task-987713" [ 883.068345] env[61439]: _type = "Task" [ 883.068345] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 883.075992] env[61439]: DEBUG oslo_vmware.api [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Task: {'id': task-987713, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 883.371455] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Preparing fetch location {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 883.371806] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Creating directory with path [datastore2] vmware_temp/3032fae6-a192-4085-903e-eea01055c360/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 883.371925] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-524e5319-1ce2-43d4-929b-aa797903ef2b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 883.383806] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Created directory with path [datastore2] vmware_temp/3032fae6-a192-4085-903e-eea01055c360/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 883.383993] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Fetch image to [datastore2] vmware_temp/3032fae6-a192-4085-903e-eea01055c360/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 883.384179] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to [datastore2] vmware_temp/3032fae6-a192-4085-903e-eea01055c360/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 883.384908] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca45dcfe-210e-4a45-bfc6-e8e88ec4b689 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 883.391331] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd93c925-13b0-42ae-acc9-baf35f77303d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 883.400025] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c18b1c64-65e6-428e-82c6-c6cc9b37b54e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 883.430856] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee09a9d4-4bf7-41cc-b02c-15e963386013 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 883.436382] env[61439]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a9d4eaf5-476b-40a6-af53-8fdc4abd7f36 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 883.454446] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 883.503129] env[61439]: DEBUG oslo_vmware.rw_handles [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3032fae6-a192-4085-903e-eea01055c360/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 883.562021] env[61439]: DEBUG oslo_vmware.rw_handles [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Completed reading data from the image iterator. {{(pid=61439) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 883.562213] env[61439]: DEBUG oslo_vmware.rw_handles [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3032fae6-a192-4085-903e-eea01055c360/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 883.577545] env[61439]: DEBUG oslo_vmware.api [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Task: {'id': task-987713, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.042035} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 883.577904] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Deleted the datastore file {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 883.578114] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Deleted contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 883.578293] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 883.578469] env[61439]: INFO nova.compute.manager [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Took 0.56 seconds to destroy the instance on the hypervisor.
[ 883.578708] env[61439]: DEBUG oslo.service.loopingcall [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 883.578906] env[61439]: DEBUG nova.compute.manager [-] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Skipping network deallocation for instance since networking was not requested. {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 883.580958] env[61439]: DEBUG nova.compute.claims [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 883.581144] env[61439]: DEBUG oslo_concurrency.lockutils [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 883.581358] env[61439]: DEBUG oslo_concurrency.lockutils [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 883.746928] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ae0219f-6726-4a04-9749-a51ec88a5ec1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 883.754712] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcbff6cb-1f16-4949-b4c4-d0cbefca2140 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 883.784700] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a87d9c80-379f-4bf0-a2cb-df1a5981cf3a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 883.791280] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25fae37d-2458-4a06-aa82-e37a513bd73d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 883.803604] env[61439]: DEBUG nova.compute.provider_tree [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 883.813009] env[61439]: DEBUG nova.scheduler.client.report [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 883.826731] env[61439]: DEBUG oslo_concurrency.lockutils [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.245s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 883.827275] env[61439]: ERROR nova.compute.manager [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 883.827275] env[61439]: Faults: ['InvalidArgument']
[ 883.827275] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] Traceback (most recent call last):
[ 883.827275] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 883.827275] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] self.driver.spawn(context, instance, image_meta,
[ 883.827275] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 883.827275] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 883.827275] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 883.827275] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] self._fetch_image_if_missing(context, vi)
[ 883.827275] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 883.827275] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] image_cache(vi, tmp_image_ds_loc)
[ 883.827275] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 883.827275] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] vm_util.copy_virtual_disk(
[ 883.827275] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 883.827275] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] session._wait_for_task(vmdk_copy_task)
[ 883.827275] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 883.827275] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] return self.wait_for_task(task_ref)
[ 883.827275] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 883.827275] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] return evt.wait()
[ 883.827275] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 883.827275] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] result = hub.switch()
[ 883.827275] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 883.827275] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] return self.greenlet.switch()
[ 883.827275] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 883.827275] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] self.f(*self.args, **self.kw)
[ 883.827275] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 883.827275] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] raise exceptions.translate_fault(task_info.error)
[ 883.827275] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 883.827275] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505] Faults: ['InvalidArgument']
[ 883.827275] env[61439]: ERROR nova.compute.manager [instance: 42ca8a89-5938-491b-b122-deac71d18505]
[ 883.828071] env[61439]: DEBUG nova.compute.utils [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] VimFaultException {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 883.829304] env[61439]: DEBUG nova.compute.manager [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Build of instance 42ca8a89-5938-491b-b122-deac71d18505 was re-scheduled: A specified parameter was not correct: fileType
[ 883.829304] env[61439]: Faults: ['InvalidArgument'] {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 883.829669] env[61439]: DEBUG nova.compute.manager [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 883.829983] env[61439]: DEBUG oslo_concurrency.lockutils [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Acquiring lock "refresh_cache-42ca8a89-5938-491b-b122-deac71d18505" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 883.830168] env[61439]: DEBUG oslo_concurrency.lockutils [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Acquired lock "refresh_cache-42ca8a89-5938-491b-b122-deac71d18505" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 883.830358] env[61439]: DEBUG nova.network.neutron [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 883.853801] env[61439]: DEBUG nova.network.neutron [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 883.965818] env[61439]: DEBUG nova.network.neutron [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 883.976763] env[61439]: DEBUG oslo_concurrency.lockutils [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Releasing lock "refresh_cache-42ca8a89-5938-491b-b122-deac71d18505" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 883.977014] env[61439]: DEBUG nova.compute.manager [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 883.977229] env[61439]: DEBUG nova.compute.manager [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] [instance: 42ca8a89-5938-491b-b122-deac71d18505] Skipping network deallocation for instance since networking was not requested. {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 884.057436] env[61439]: INFO nova.scheduler.client.report [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Deleted allocations for instance 42ca8a89-5938-491b-b122-deac71d18505
[ 884.077767] env[61439]: DEBUG oslo_concurrency.lockutils [None req-17e35825-abc3-4ce8-be80-288f7c0e539d tempest-ServerShowV254Test-774507648 tempest-ServerShowV254Test-774507648-project-member] Lock "42ca8a89-5938-491b-b122-deac71d18505" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 141.255s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 889.202093] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 889.202398] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 889.202508] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Cleaning up deleted instances {{(pid=61439) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}}
[ 889.214326] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] There are 0 instances to clean {{(pid=61439) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}}
[ 889.214574] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 889.214744] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Cleaning up deleted instances with incomplete migration {{(pid=61439) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}}
[ 889.225102] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 890.231363] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager.update_available_resource {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 890.242719] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 890.242940] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 890.243119] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 890.243280] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=61439) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 890.244366] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-909d2a34-0a2e-423f-b7c0-ca56b1093caa {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 890.253131] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da6a9d98-ce46-4bc4-8049-435a5a06acb8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 890.266994] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-202f1d39-623a-46f4-82d9-a7d9a4c5a0b2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 890.273361] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d969c65-3f23-4f99-84eb-4fed24a2e20a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 890.303920] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181580MB free_disk=35GB free_vcpus=48 pci_devices=None {{(pid=61439) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 890.304143] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 890.304354] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 890.370054] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance bf9101c9-4072-4f72-8ac3-24b7a5b88b45 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 890.370227] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 890.370381] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance eeed1f83-89d2-4887-9e44-b269a2e295ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 890.370517] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance e51da790-9736-4181-9562-1a8f87895bd2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 890.370641] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 7f0c1eef-750c-4d8f-8d90-a02898fdeee1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 890.370762] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 8089bd3f-47e7-4490-8bfc-a1d87bf559ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 890.370879] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance b954e159-4d89-4c61-a5bc-5e6c67cf278c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 890.370996] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 2b55d3f3-cff9-4e34-936e-ece6759cfd40 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 890.371128] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance aeeb7c6c-7413-46b0-8632-c7224620e9b2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 890.371321] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 890.371462] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 890.480379] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98a45388-e087-4b30-9f9f-ae0022257e06 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 890.488401] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c73120f1-5839-4c3b-a0f9-02c62319fdd9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 890.517745] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-120d58be-036c-43e4-bdcf-69d57550ed07 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 890.524798] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d849c58-4f6c-4f30-b35d-4c11b65be025 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 890.537720] env[61439]: DEBUG nova.compute.provider_tree [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 890.546018] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 890.558957] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=61439) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 890.559165] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.255s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 891.530752] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 891.530752] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Starting heal instance info cache {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 891.530752] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Rebuilding the list of instances to heal {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 891.573971] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 891.574134] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 891.574276] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 891.574410] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 891.574539] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 891.574666] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 891.574789] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 891.574913] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 891.575044] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 891.575179] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Didn't find any instances for network info cache update. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 891.575738] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 893.201625] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 893.201962] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 893.202025] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 893.202137] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] CONF.reclaim_instance_interval <= 0, skipping...
{{(pid=61439) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 895.203631] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 896.201627] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 932.332015] env[61439]: WARNING oslo_vmware.rw_handles [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 932.332015] env[61439]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 932.332015] env[61439]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 932.332015] env[61439]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 932.332015] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 932.332015] env[61439]: ERROR oslo_vmware.rw_handles response.begin() [ 932.332015] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 932.332015] env[61439]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 932.332015] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 932.332015] env[61439]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection 
without" [ 932.332015] env[61439]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 932.332015] env[61439]: ERROR oslo_vmware.rw_handles [ 932.332668] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Downloaded image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to vmware_temp/3032fae6-a192-4085-903e-eea01055c360/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 932.334426] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Caching image {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 932.334695] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Copying Virtual Disk [datastore2] vmware_temp/3032fae6-a192-4085-903e-eea01055c360/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk to [datastore2] vmware_temp/3032fae6-a192-4085-903e-eea01055c360/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk {{(pid=61439) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 932.335101] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-0c39e3b8-6c33-4772-9482-1c855f83e396 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 932.342781] env[61439]: DEBUG oslo_vmware.api [None 
req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Waiting for the task: (returnval){ [ 932.342781] env[61439]: value = "task-987714" [ 932.342781] env[61439]: _type = "Task" [ 932.342781] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 932.351427] env[61439]: DEBUG oslo_vmware.api [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Task: {'id': task-987714, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 932.852619] env[61439]: DEBUG oslo_vmware.exceptions [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Fault InvalidArgument not matched. 
{{(pid=61439) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 932.852868] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 932.853470] env[61439]: ERROR nova.compute.manager [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 932.853470] env[61439]: Faults: ['InvalidArgument'] [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Traceback (most recent call last): [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] yield resources [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] self.driver.spawn(context, instance, image_meta, [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: 
bf9101c9-4072-4f72-8ac3-24b7a5b88b45] self._vmops.spawn(context, instance, image_meta, injected_files, [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] self._fetch_image_if_missing(context, vi) [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] image_cache(vi, tmp_image_ds_loc) [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] vm_util.copy_virtual_disk( [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] session._wait_for_task(vmdk_copy_task) [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] return self.wait_for_task(task_ref) [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: 
bf9101c9-4072-4f72-8ac3-24b7a5b88b45] return evt.wait() [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] result = hub.switch() [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] return self.greenlet.switch() [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] self.f(*self.args, **self.kw) [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] raise exceptions.translate_fault(task_info.error) [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Faults: ['InvalidArgument'] [ 932.853470] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] [ 932.854427] env[61439]: INFO nova.compute.manager [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 
tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Terminating instance [ 932.856106] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 932.856330] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 932.856858] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Acquiring lock "refresh_cache-bf9101c9-4072-4f72-8ac3-24b7a5b88b45" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 932.857029] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Acquired lock "refresh_cache-bf9101c9-4072-4f72-8ac3-24b7a5b88b45" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 932.857203] env[61439]: DEBUG nova.network.neutron [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Building network info cache for instance {{(pid=61439) 
_get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 932.858147] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ef6e402a-eab5-4800-98ed-3a0fc37724e2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 932.867651] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 932.867830] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=61439) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 932.868790] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8af4efdd-e4d5-4713-bff8-b6b2ced706b7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 932.873780] env[61439]: DEBUG oslo_vmware.api [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Waiting for the task: (returnval){ [ 932.873780] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]526b7829-8688-08c9-619d-3028a9c28f29" [ 932.873780] env[61439]: _type = "Task" [ 932.873780] env[61439]: } to complete. 
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 932.881206] env[61439]: DEBUG oslo_vmware.api [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]526b7829-8688-08c9-619d-3028a9c28f29, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 932.884552] env[61439]: DEBUG nova.network.neutron [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 932.945498] env[61439]: DEBUG nova.network.neutron [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 932.954286] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Releasing lock "refresh_cache-bf9101c9-4072-4f72-8ac3-24b7a5b88b45" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 932.954682] env[61439]: DEBUG nova.compute.manager [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 932.954871] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 932.955918] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-279d57f9-096d-495b-8bd7-217c3cd40ee4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 932.963566] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Unregistering the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 932.963781] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-634ddd7e-ca39-4aaf-9e03-d6e7d1352483 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 932.988390] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Unregistered the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 932.988606] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Deleting contents of the VM from datastore 
datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 932.988785] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Deleting the datastore file [datastore2] bf9101c9-4072-4f72-8ac3-24b7a5b88b45 {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 932.989308] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ddbfd57a-070f-43a5-8b48-466d19134584 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 932.995075] env[61439]: DEBUG oslo_vmware.api [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Waiting for the task: (returnval){ [ 932.995075] env[61439]: value = "task-987716" [ 932.995075] env[61439]: _type = "Task" [ 932.995075] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 933.002504] env[61439]: DEBUG oslo_vmware.api [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Task: {'id': task-987716, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 933.384566] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Preparing fetch location {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 933.384904] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Creating directory with path [datastore2] vmware_temp/aff2eb74-e3e6-46cc-b643-977ed26e0065/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 933.385035] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e98c4c92-4e4d-4a41-8407-080fd0a1bd30 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 933.396372] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Created directory with path [datastore2] vmware_temp/aff2eb74-e3e6-46cc-b643-977ed26e0065/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 933.396559] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Fetch image to [datastore2] vmware_temp/aff2eb74-e3e6-46cc-b643-977ed26e0065/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk {{(pid=61439) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 933.396730] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to [datastore2] vmware_temp/aff2eb74-e3e6-46cc-b643-977ed26e0065/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 933.397456] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29d18448-b774-4c92-8adc-695d69bb2722 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 933.404150] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cc8ea6d-3e45-4881-bef8-e7756a09759d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 933.412955] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b746c7c9-bf01-48b1-aab8-6ed36b0a4e28 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 933.444522] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8792bc8-a93d-40ac-943f-c4c555373dac {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 933.450288] env[61439]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c6f93dec-8a33-476a-8cd3-26e7cb4a4a41 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 933.477302] env[61439]: DEBUG 
nova.virt.vmwareapi.images [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 933.505476] env[61439]: DEBUG oslo_vmware.api [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Task: {'id': task-987716, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.045948} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 933.505733] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Deleted the datastore file {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 933.505918] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Deleted contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 933.506241] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 933.506271] env[61439]: INFO nova.compute.manager [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 
tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Took 0.55 seconds to destroy the instance on the hypervisor. [ 933.506492] env[61439]: DEBUG oslo.service.loopingcall [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 933.506700] env[61439]: DEBUG nova.compute.manager [-] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Skipping network deallocation for instance since networking was not requested. {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 933.508818] env[61439]: DEBUG nova.compute.claims [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 933.508991] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 933.509239] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 
0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 933.549379] env[61439]: DEBUG oslo_vmware.rw_handles [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/aff2eb74-e3e6-46cc-b643-977ed26e0065/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 933.624534] env[61439]: DEBUG oslo_vmware.rw_handles [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Completed reading data from the image iterator. {{(pid=61439) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 933.624758] env[61439]: DEBUG oslo_vmware.rw_handles [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/aff2eb74-e3e6-46cc-b643-977ed26e0065/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=61439) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 933.630165] env[61439]: DEBUG nova.scheduler.client.report [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Refreshing inventories for resource provider b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 933.646100] env[61439]: DEBUG nova.scheduler.client.report [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Updating ProviderTree inventory for provider b35c9fce-988b-4acc-b175-83b202107c41 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 933.646327] env[61439]: DEBUG nova.compute.provider_tree [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Updating inventory in ProviderTree for provider b35c9fce-988b-4acc-b175-83b202107c41 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) update_inventory 
/opt/stack/nova/nova/compute/provider_tree.py:176}} [ 933.658467] env[61439]: DEBUG nova.scheduler.client.report [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Refreshing aggregate associations for resource provider b35c9fce-988b-4acc-b175-83b202107c41, aggregates: None {{(pid=61439) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 933.677429] env[61439]: DEBUG nova.scheduler.client.report [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Refreshing trait associations for resource provider b35c9fce-988b-4acc-b175-83b202107c41, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NODE {{(pid=61439) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 933.799218] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66883dbd-b262-4619-9282-2bccb143fb45 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 933.806583] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f07fe09c-591e-44f8-8298-f2a6b660e353 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 933.835485] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8207bd0-d8cc-4b13-a889-257ff3e451d2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 933.842849] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2bc9d80-eff3-4de4-9edc-7b73a5422b85 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 933.855807] env[61439]: DEBUG nova.compute.provider_tree [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 933.865104] env[61439]: DEBUG nova.scheduler.client.report [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 933.881492] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.372s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 933.882064] env[61439]: ERROR nova.compute.manager [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A 
specified parameter was not correct: fileType [ 933.882064] env[61439]: Faults: ['InvalidArgument'] [ 933.882064] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Traceback (most recent call last): [ 933.882064] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 933.882064] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] self.driver.spawn(context, instance, image_meta, [ 933.882064] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 933.882064] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] self._vmops.spawn(context, instance, image_meta, injected_files, [ 933.882064] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 933.882064] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] self._fetch_image_if_missing(context, vi) [ 933.882064] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 933.882064] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] image_cache(vi, tmp_image_ds_loc) [ 933.882064] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 933.882064] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] vm_util.copy_virtual_disk( [ 933.882064] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 933.882064] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] session._wait_for_task(vmdk_copy_task) [ 933.882064] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 933.882064] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] return self.wait_for_task(task_ref) [ 933.882064] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 933.882064] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] return evt.wait() [ 933.882064] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 933.882064] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] result = hub.switch() [ 933.882064] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 933.882064] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] return self.greenlet.switch() [ 933.882064] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 933.882064] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] self.f(*self.args, **self.kw) [ 933.882064] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 933.882064] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] raise exceptions.translate_fault(task_info.error) [ 933.882064] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 933.882064] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Faults: ['InvalidArgument'] [ 933.882064] env[61439]: ERROR nova.compute.manager [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] [ 933.882991] env[61439]: DEBUG nova.compute.utils [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] VimFaultException {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 933.884273] env[61439]: DEBUG nova.compute.manager [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Build of instance bf9101c9-4072-4f72-8ac3-24b7a5b88b45 was re-scheduled: A specified parameter was not correct: fileType [ 933.884273] env[61439]: Faults: ['InvalidArgument'] {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 933.884656] env[61439]: DEBUG nova.compute.manager [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 933.884896] env[61439]: DEBUG oslo_concurrency.lockutils [None 
req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Acquiring lock "refresh_cache-bf9101c9-4072-4f72-8ac3-24b7a5b88b45" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 933.885069] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Acquired lock "refresh_cache-bf9101c9-4072-4f72-8ac3-24b7a5b88b45" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 933.885237] env[61439]: DEBUG nova.network.neutron [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 933.917468] env[61439]: DEBUG nova.network.neutron [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 934.039168] env[61439]: DEBUG nova.network.neutron [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 934.048713] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Releasing lock "refresh_cache-bf9101c9-4072-4f72-8ac3-24b7a5b88b45" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 934.048938] env[61439]: DEBUG nova.compute.manager [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 934.049150] env[61439]: DEBUG nova.compute.manager [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] [instance: bf9101c9-4072-4f72-8ac3-24b7a5b88b45] Skipping network deallocation for instance since networking was not requested. 
{{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 934.142034] env[61439]: INFO nova.scheduler.client.report [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Deleted allocations for instance bf9101c9-4072-4f72-8ac3-24b7a5b88b45 [ 934.161325] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f39ff465-836e-48cf-9c6e-b3ad3fbeaba5 tempest-ServerDiagnosticsV248Test-537127248 tempest-ServerDiagnosticsV248Test-537127248-project-member] Lock "bf9101c9-4072-4f72-8ac3-24b7a5b88b45" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 180.226s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 937.333475] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Acquiring lock "7b48729f-86a7-4b53-ad11-ef8a929ec947" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 937.333751] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Lock "7b48729f-86a7-4b53-ad11-ef8a929ec947" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 937.343981] env[61439]: DEBUG nova.compute.manager [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 
tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 937.394259] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 937.394521] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 937.396120] env[61439]: INFO nova.compute.claims [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 937.543856] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bf2462b-6eea-4b1c-8033-9af534f0fc01 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 937.551577] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48beef98-5217-4a36-bd37-e1b3bf425b61 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 937.586398] env[61439]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e40c4904-ecba-4cc6-88c2-b5f83a3b95a6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 937.594066] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-512ff45b-dc47-43aa-b798-30ab45d1f573 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 937.607629] env[61439]: DEBUG nova.compute.provider_tree [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 937.615618] env[61439]: DEBUG nova.scheduler.client.report [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 937.628237] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 
0.234s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 937.628697] env[61439]: DEBUG nova.compute.manager [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 937.661878] env[61439]: DEBUG nova.compute.utils [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 937.663890] env[61439]: DEBUG nova.compute.manager [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 937.664121] env[61439]: DEBUG nova.network.neutron [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 937.671939] env[61439]: DEBUG nova.compute.manager [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Start building block device mappings for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 937.732909] env[61439]: DEBUG nova.compute.manager [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 937.757120] env[61439]: DEBUG nova.virt.hardware [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 937.757366] env[61439]: DEBUG nova.virt.hardware [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 937.757527] env[61439]: DEBUG nova.virt.hardware [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d 
tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 937.757711] env[61439]: DEBUG nova.virt.hardware [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 937.757861] env[61439]: DEBUG nova.virt.hardware [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 937.758022] env[61439]: DEBUG nova.virt.hardware [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 937.758238] env[61439]: DEBUG nova.virt.hardware [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 937.758401] env[61439]: DEBUG nova.virt.hardware [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 
937.758577] env[61439]: DEBUG nova.virt.hardware [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 937.758752] env[61439]: DEBUG nova.virt.hardware [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 937.758923] env[61439]: DEBUG nova.virt.hardware [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 937.759785] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4985efd4-d10e-49b0-a7c3-4b11eb6572ab {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 937.767674] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1318ba9-9b80-4f4f-baf0-0de3a68fffe0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 937.772924] env[61439]: DEBUG nova.policy [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0d672fb4dc3048a4a1bc300f0d91f1b7', 'user_domain_id': 'default', 'system_scope': None, 
'domain_id': None, 'project_id': 'b72c7e53e97e46efa6a0fd8ccf211f1d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 938.111908] env[61439]: DEBUG nova.network.neutron [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Successfully created port: 6e5994c1-39cb-42b6-a665-678b04342c84 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 939.104883] env[61439]: DEBUG nova.compute.manager [req-579b40e2-b506-4bc1-9f7e-1b25a4884038 req-fb62e470-d703-4ba2-b105-9c141bc36c39 service nova] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Received event network-vif-plugged-6e5994c1-39cb-42b6-a665-678b04342c84 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 939.105129] env[61439]: DEBUG oslo_concurrency.lockutils [req-579b40e2-b506-4bc1-9f7e-1b25a4884038 req-fb62e470-d703-4ba2-b105-9c141bc36c39 service nova] Acquiring lock "7b48729f-86a7-4b53-ad11-ef8a929ec947-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 939.105341] env[61439]: DEBUG oslo_concurrency.lockutils [req-579b40e2-b506-4bc1-9f7e-1b25a4884038 req-fb62e470-d703-4ba2-b105-9c141bc36c39 service nova] Lock "7b48729f-86a7-4b53-ad11-ef8a929ec947-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 939.105513] env[61439]: DEBUG oslo_concurrency.lockutils 
[req-579b40e2-b506-4bc1-9f7e-1b25a4884038 req-fb62e470-d703-4ba2-b105-9c141bc36c39 service nova] Lock "7b48729f-86a7-4b53-ad11-ef8a929ec947-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 939.105684] env[61439]: DEBUG nova.compute.manager [req-579b40e2-b506-4bc1-9f7e-1b25a4884038 req-fb62e470-d703-4ba2-b105-9c141bc36c39 service nova] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] No waiting events found dispatching network-vif-plugged-6e5994c1-39cb-42b6-a665-678b04342c84 {{(pid=61439) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 939.105852] env[61439]: WARNING nova.compute.manager [req-579b40e2-b506-4bc1-9f7e-1b25a4884038 req-fb62e470-d703-4ba2-b105-9c141bc36c39 service nova] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Received unexpected event network-vif-plugged-6e5994c1-39cb-42b6-a665-678b04342c84 for instance with vm_state building and task_state spawning. 
[ 939.182817] env[61439]: DEBUG nova.network.neutron [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Successfully updated port: 6e5994c1-39cb-42b6-a665-678b04342c84 {{(pid=61439) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 939.194176] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Acquiring lock "refresh_cache-7b48729f-86a7-4b53-ad11-ef8a929ec947" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 939.194416] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Acquired lock "refresh_cache-7b48729f-86a7-4b53-ad11-ef8a929ec947" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 939.194655] env[61439]: DEBUG nova.network.neutron [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 939.299299] env[61439]: DEBUG nova.network.neutron [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 939.453975] env[61439]: DEBUG nova.network.neutron [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Updating instance_info_cache with network_info: [{"id": "6e5994c1-39cb-42b6-a665-678b04342c84", "address": "fa:16:3e:89:7e:2c", "network": {"id": "e740c654-c12e-49cb-af60-ccd4008d5a05", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.130", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d8f76046251a4a44a275999df0a57832", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69e41c97-4d75-4041-ae71-321e7e9d480b", "external-id": "nsx-vlan-transportzone-483", "segmentation_id": 483, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6e5994c1-39", "ovs_interfaceid": "6e5994c1-39cb-42b6-a665-678b04342c84", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 939.466868] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Releasing lock "refresh_cache-7b48729f-86a7-4b53-ad11-ef8a929ec947" {{(pid=61439) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 939.467116] env[61439]: DEBUG nova.compute.manager [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Instance network_info: |[{"id": "6e5994c1-39cb-42b6-a665-678b04342c84", "address": "fa:16:3e:89:7e:2c", "network": {"id": "e740c654-c12e-49cb-af60-ccd4008d5a05", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.130", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d8f76046251a4a44a275999df0a57832", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69e41c97-4d75-4041-ae71-321e7e9d480b", "external-id": "nsx-vlan-transportzone-483", "segmentation_id": 483, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6e5994c1-39", "ovs_interfaceid": "6e5994c1-39cb-42b6-a665-678b04342c84", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 939.467485] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:89:7e:2c', 'network_ref': {'type': 'OpaqueNetwork', 
'network-id': '69e41c97-4d75-4041-ae71-321e7e9d480b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6e5994c1-39cb-42b6-a665-678b04342c84', 'vif_model': 'vmxnet3'}] {{(pid=61439) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 939.475558] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Creating folder: Project (b72c7e53e97e46efa6a0fd8ccf211f1d). Parent ref: group-v221281. {{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 939.476063] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-45b507f8-2fec-45df-8900-e8422e5a6c93 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 939.488300] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Created folder: Project (b72c7e53e97e46efa6a0fd8ccf211f1d) in parent group-v221281. [ 939.488485] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Creating folder: Instances. Parent ref: group-v221328. 
{{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 939.488707] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-356000a7-848f-46e1-8872-fae14e655af9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 939.498942] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Created folder: Instances in parent group-v221328. [ 939.499190] env[61439]: DEBUG oslo.service.loopingcall [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 939.499369] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Creating VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 939.499564] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c2609411-da3c-4ff1-b20f-e02f8d3cd06a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 939.518372] env[61439]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 939.518372] env[61439]: value = "task-987719" [ 939.518372] env[61439]: _type = "Task" [ 939.518372] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 939.528524] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987719, 'name': CreateVM_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 940.028620] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987719, 'name': CreateVM_Task, 'duration_secs': 0.323814} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 940.031053] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Created VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 940.031053] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 940.031053] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 940.031053] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 940.031053] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-acf38049-a956-40b9-a0a7-e04aed9d62c0 
{{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 940.034875] env[61439]: DEBUG oslo_vmware.api [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Waiting for the task: (returnval){ [ 940.034875] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52334167-019e-58fc-76a0-b408b93f51d6" [ 940.034875] env[61439]: _type = "Task" [ 940.034875] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 940.042032] env[61439]: DEBUG oslo_vmware.api [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52334167-019e-58fc-76a0-b408b93f51d6, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 940.545628] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 940.545957] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Processing image a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 940.546082] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 941.126127] env[61439]: DEBUG nova.compute.manager [req-404983aa-df16-4d12-b0be-4ed773691dab req-684fe7b0-3a1d-44dc-bfe6-2f6936b0258f service nova] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Received event network-changed-6e5994c1-39cb-42b6-a665-678b04342c84 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 941.126409] env[61439]: DEBUG nova.compute.manager [req-404983aa-df16-4d12-b0be-4ed773691dab req-684fe7b0-3a1d-44dc-bfe6-2f6936b0258f service nova] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Refreshing instance network info cache due to event 
network-changed-6e5994c1-39cb-42b6-a665-678b04342c84. {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 941.126589] env[61439]: DEBUG oslo_concurrency.lockutils [req-404983aa-df16-4d12-b0be-4ed773691dab req-684fe7b0-3a1d-44dc-bfe6-2f6936b0258f service nova] Acquiring lock "refresh_cache-7b48729f-86a7-4b53-ad11-ef8a929ec947" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 941.126694] env[61439]: DEBUG oslo_concurrency.lockutils [req-404983aa-df16-4d12-b0be-4ed773691dab req-684fe7b0-3a1d-44dc-bfe6-2f6936b0258f service nova] Acquired lock "refresh_cache-7b48729f-86a7-4b53-ad11-ef8a929ec947" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 941.126856] env[61439]: DEBUG nova.network.neutron [req-404983aa-df16-4d12-b0be-4ed773691dab req-684fe7b0-3a1d-44dc-bfe6-2f6936b0258f service nova] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Refreshing network info cache for port 6e5994c1-39cb-42b6-a665-678b04342c84 {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 941.381703] env[61439]: DEBUG nova.network.neutron [req-404983aa-df16-4d12-b0be-4ed773691dab req-684fe7b0-3a1d-44dc-bfe6-2f6936b0258f service nova] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Updated VIF entry in instance network info cache for port 6e5994c1-39cb-42b6-a665-678b04342c84. 
{{(pid=61439) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 941.382067] env[61439]: DEBUG nova.network.neutron [req-404983aa-df16-4d12-b0be-4ed773691dab req-684fe7b0-3a1d-44dc-bfe6-2f6936b0258f service nova] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Updating instance_info_cache with network_info: [{"id": "6e5994c1-39cb-42b6-a665-678b04342c84", "address": "fa:16:3e:89:7e:2c", "network": {"id": "e740c654-c12e-49cb-af60-ccd4008d5a05", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.130", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d8f76046251a4a44a275999df0a57832", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69e41c97-4d75-4041-ae71-321e7e9d480b", "external-id": "nsx-vlan-transportzone-483", "segmentation_id": 483, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6e5994c1-39", "ovs_interfaceid": "6e5994c1-39cb-42b6-a665-678b04342c84", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 941.391325] env[61439]: DEBUG oslo_concurrency.lockutils [req-404983aa-df16-4d12-b0be-4ed773691dab req-684fe7b0-3a1d-44dc-bfe6-2f6936b0258f service nova] Releasing lock "refresh_cache-7b48729f-86a7-4b53-ad11-ef8a929ec947" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 951.202268] env[61439]: DEBUG oslo_service.periodic_task [None 
req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 951.202569] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager.update_available_resource {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 951.213450] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 951.213665] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 951.213832] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 951.213989] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=61439) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 951.215082] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-6a9bef9d-8e00-4f00-9336-6bd2c1ab4bab {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 951.224044] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1e67a62-7d9f-4fe2-8031-c1905c0a815b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 951.237851] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89f3a624-6340-4d26-bf0f-7824c3c2373a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 951.244103] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3eaf7966-c03c-4445-ad36-0f8ffcfcbc61 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 951.275527] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181583MB free_disk=35GB free_vcpus=48 pci_devices=None {{(pid=61439) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 951.275687] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 951.275887] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=61439) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 951.344572] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 951.344748] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance eeed1f83-89d2-4887-9e44-b269a2e295ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 951.344918] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance e51da790-9736-4181-9562-1a8f87895bd2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 951.345066] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 7f0c1eef-750c-4d8f-8d90-a02898fdeee1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 951.345196] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 8089bd3f-47e7-4490-8bfc-a1d87bf559ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 951.345320] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance b954e159-4d89-4c61-a5bc-5e6c67cf278c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 951.345443] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 2b55d3f3-cff9-4e34-936e-ece6759cfd40 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 951.345562] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance aeeb7c6c-7413-46b0-8632-c7224620e9b2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 951.345680] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 7b48729f-86a7-4b53-ad11-ef8a929ec947 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 951.345875] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 951.346125] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 951.459175] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0d0080a-bf72-4c52-8ad3-f639196284bb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 951.466960] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18017061-ab7b-4acf-9d71-06d4e6ed8066 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 951.495862] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccdada4a-c719-4488-acc3-b78f3964ab7e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
951.502856] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cffc8c37-ed93-48fb-8b24-255bf0285a9d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 951.515709] env[61439]: DEBUG nova.compute.provider_tree [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 951.523475] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 951.535822] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=61439) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 951.536012] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.260s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 952.539891] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None 
None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 952.539891] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Starting heal instance info cache {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 952.539891] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Rebuilding the list of instances to heal {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 952.557035] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 952.557165] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 952.557275] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 952.557409] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Skipping network cache update for instance because it is Building. 
{{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 952.557535] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 952.557659] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 952.558180] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 952.558180] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 952.558180] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 952.558316] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Didn't find any instances for network info cache update. 
{{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 953.201559] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 954.197362] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 954.197615] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 954.218060] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 954.218060] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 954.218060] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=61439) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 955.202157] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 956.201811] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 982.608845] env[61439]: WARNING oslo_vmware.rw_handles [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 982.608845] env[61439]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 982.608845] env[61439]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 982.608845] env[61439]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 982.608845] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 982.608845] env[61439]: ERROR oslo_vmware.rw_handles response.begin() [ 982.608845] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 982.608845] env[61439]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 982.608845] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 982.608845] env[61439]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 
982.608845] env[61439]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 982.608845] env[61439]: ERROR oslo_vmware.rw_handles [ 982.609432] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Downloaded image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to vmware_temp/aff2eb74-e3e6-46cc-b643-977ed26e0065/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 982.611173] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Caching image {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 982.611432] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Copying Virtual Disk [datastore2] vmware_temp/aff2eb74-e3e6-46cc-b643-977ed26e0065/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk to [datastore2] vmware_temp/aff2eb74-e3e6-46cc-b643-977ed26e0065/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk {{(pid=61439) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 982.611824] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-bb0647c8-78b8-4a9e-b41a-3495b3bd8d51 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 982.620508] env[61439]: DEBUG oslo_vmware.api [None 
req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Waiting for the task: (returnval){ [ 982.620508] env[61439]: value = "task-987720" [ 982.620508] env[61439]: _type = "Task" [ 982.620508] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 982.628520] env[61439]: DEBUG oslo_vmware.api [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Task: {'id': task-987720, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 983.130213] env[61439]: DEBUG oslo_vmware.exceptions [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Fault InvalidArgument not matched. 
{{(pid=61439) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 983.130502] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 983.131182] env[61439]: ERROR nova.compute.manager [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 983.131182] env[61439]: Faults: ['InvalidArgument'] [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Traceback (most recent call last): [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] yield resources [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] self.driver.spawn(context, instance, image_meta, [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: 
eeed1f83-89d2-4887-9e44-b269a2e295ae] self._vmops.spawn(context, instance, image_meta, injected_files, [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] self._fetch_image_if_missing(context, vi) [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] image_cache(vi, tmp_image_ds_loc) [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] vm_util.copy_virtual_disk( [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] session._wait_for_task(vmdk_copy_task) [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] return self.wait_for_task(task_ref) [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: 
eeed1f83-89d2-4887-9e44-b269a2e295ae] return evt.wait() [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] result = hub.switch() [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] return self.greenlet.switch() [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] self.f(*self.args, **self.kw) [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] raise exceptions.translate_fault(task_info.error) [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Faults: ['InvalidArgument'] [ 983.131182] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] [ 983.131960] env[61439]: INFO nova.compute.manager [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 
tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Terminating instance [ 983.133013] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 983.133223] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 983.133460] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-df98cb5a-85fb-46c9-b16d-57adb4d80418 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 983.135645] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Acquiring lock "refresh_cache-eeed1f83-89d2-4887-9e44-b269a2e295ae" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 983.135810] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Acquired lock "refresh_cache-eeed1f83-89d2-4887-9e44-b269a2e295ae" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 983.135980] env[61439]: DEBUG nova.network.neutron [None 
req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 983.142988] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 983.142988] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=61439) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 983.144191] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b8512f11-4717-4c15-b6a7-d8a878528eea {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 983.151189] env[61439]: DEBUG oslo_vmware.api [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Waiting for the task: (returnval){ [ 983.151189] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52067660-bb75-02e2-f00e-3c1980430df3" [ 983.151189] env[61439]: _type = "Task" [ 983.151189] env[61439]: } to complete. 
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 983.158978] env[61439]: DEBUG oslo_vmware.api [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52067660-bb75-02e2-f00e-3c1980430df3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 983.167040] env[61439]: DEBUG nova.network.neutron [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 983.227867] env[61439]: DEBUG nova.network.neutron [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 983.236317] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Releasing lock "refresh_cache-eeed1f83-89d2-4887-9e44-b269a2e295ae" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 983.236695] env[61439]: DEBUG nova.compute.manager [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 983.236888] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 983.237920] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19c252ce-46df-4dce-b14d-688fae7d39a6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 983.245633] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Unregistering the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 983.245928] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-bd33a02f-b26c-4b18-b044-149c98d80a7f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 983.273867] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Unregistered the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 983.274094] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Deleting contents of the VM from datastore datastore2 {{(pid=61439) 
_destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 983.274281] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Deleting the datastore file [datastore2] eeed1f83-89d2-4887-9e44-b269a2e295ae {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 983.274514] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-99c9f3a0-1ae6-4d63-983c-d77ebcc15ac8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 983.280372] env[61439]: DEBUG oslo_vmware.api [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Waiting for the task: (returnval){ [ 983.280372] env[61439]: value = "task-987722" [ 983.280372] env[61439]: _type = "Task" [ 983.280372] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 983.287396] env[61439]: DEBUG oslo_vmware.api [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Task: {'id': task-987722, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 983.661586] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Preparing fetch location {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 983.661915] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Creating directory with path [datastore2] vmware_temp/4e9175c7-7e78-4ffb-9740-1fa45a589454/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 983.662140] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c8ac77a8-739f-4b61-978d-1bf7a38c14fa {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 983.672820] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Created directory with path [datastore2] vmware_temp/4e9175c7-7e78-4ffb-9740-1fa45a589454/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 983.672998] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Fetch image to [datastore2] vmware_temp/4e9175c7-7e78-4ffb-9740-1fa45a589454/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 983.673184] 
env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to [datastore2] vmware_temp/4e9175c7-7e78-4ffb-9740-1fa45a589454/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 983.673856] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3151ffd7-59e1-43f5-9f5b-e40901726ce1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 983.681378] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00fd609d-d14f-467a-8233-81a5292afd5f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 983.690151] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33b8c926-d147-423e-8a72-81b42a2130b8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 983.720799] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-903ed04b-078b-4147-8c4d-ecf50a0b65ab {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 983.726513] env[61439]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7fb12cdb-be8b-4b45-b5f6-7969f27b213d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 983.759686] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 
tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 983.792703] env[61439]: DEBUG oslo_vmware.api [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Task: {'id': task-987722, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.041749} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 983.792958] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Deleted the datastore file {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 983.793160] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Deleted contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 983.793435] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 983.793539] env[61439]: INFO nova.compute.manager [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: 
eeed1f83-89d2-4887-9e44-b269a2e295ae] Took 0.56 seconds to destroy the instance on the hypervisor. [ 983.793815] env[61439]: DEBUG oslo.service.loopingcall [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 983.793977] env[61439]: DEBUG nova.compute.manager [-] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Skipping network deallocation for instance since networking was not requested. {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 983.796988] env[61439]: DEBUG nova.compute.claims [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 983.796988] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 983.797202] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 983.813857] env[61439]: 
DEBUG oslo_vmware.rw_handles [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4e9175c7-7e78-4ffb-9740-1fa45a589454/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 983.876551] env[61439]: DEBUG oslo_vmware.rw_handles [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Completed reading data from the image iterator. {{(pid=61439) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 983.876551] env[61439]: DEBUG oslo_vmware.rw_handles [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4e9175c7-7e78-4ffb-9740-1fa45a589454/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=61439) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 983.982304] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9af37e6f-20a4-418c-98e1-e15f711a1c7a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 983.989625] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7769a430-d418-4072-a063-a09214a14c0f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 984.020011] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e5f2157-a982-44ce-b4e4-b73fc8bb4528 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 984.026541] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6795475-a740-41a1-aba6-f67a2c678a01 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 984.038952] env[61439]: DEBUG nova.compute.provider_tree [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 984.046746] env[61439]: DEBUG nova.scheduler.client.report [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 
'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 984.060712] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.263s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 984.061241] env[61439]: ERROR nova.compute.manager [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 984.061241] env[61439]: Faults: ['InvalidArgument'] [ 984.061241] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Traceback (most recent call last): [ 984.061241] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 984.061241] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] self.driver.spawn(context, instance, image_meta, [ 984.061241] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 984.061241] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] self._vmops.spawn(context, instance, image_meta, 
injected_files, [ 984.061241] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 984.061241] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] self._fetch_image_if_missing(context, vi) [ 984.061241] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 984.061241] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] image_cache(vi, tmp_image_ds_loc) [ 984.061241] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 984.061241] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] vm_util.copy_virtual_disk( [ 984.061241] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 984.061241] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] session._wait_for_task(vmdk_copy_task) [ 984.061241] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 984.061241] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] return self.wait_for_task(task_ref) [ 984.061241] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 984.061241] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] return evt.wait() [ 984.061241] env[61439]: ERROR 
nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 984.061241] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] result = hub.switch() [ 984.061241] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 984.061241] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] return self.greenlet.switch() [ 984.061241] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 984.061241] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] self.f(*self.args, **self.kw) [ 984.061241] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 984.061241] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] raise exceptions.translate_fault(task_info.error) [ 984.061241] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 984.061241] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Faults: ['InvalidArgument'] [ 984.061241] env[61439]: ERROR nova.compute.manager [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] [ 984.062032] env[61439]: DEBUG nova.compute.utils [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] 
VimFaultException {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 984.063534] env[61439]: DEBUG nova.compute.manager [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Build of instance eeed1f83-89d2-4887-9e44-b269a2e295ae was re-scheduled: A specified parameter was not correct: fileType [ 984.063534] env[61439]: Faults: ['InvalidArgument'] {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 984.063917] env[61439]: DEBUG nova.compute.manager [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 984.064154] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Acquiring lock "refresh_cache-eeed1f83-89d2-4887-9e44-b269a2e295ae" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 984.064305] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Acquired lock "refresh_cache-eeed1f83-89d2-4887-9e44-b269a2e295ae" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 984.064466] env[61439]: DEBUG nova.network.neutron [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Building network 
info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 984.089107] env[61439]: DEBUG nova.network.neutron [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 984.231456] env[61439]: DEBUG nova.network.neutron [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 984.241728] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Releasing lock "refresh_cache-eeed1f83-89d2-4887-9e44-b269a2e295ae" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 984.241944] env[61439]: DEBUG nova.compute.manager [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 984.242146] env[61439]: DEBUG nova.compute.manager [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] [instance: eeed1f83-89d2-4887-9e44-b269a2e295ae] Skipping network deallocation for instance since networking was not requested. 
{{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 984.327464] env[61439]: INFO nova.scheduler.client.report [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Deleted allocations for instance eeed1f83-89d2-4887-9e44-b269a2e295ae [ 984.347405] env[61439]: DEBUG oslo_concurrency.lockutils [None req-f7fdaaad-da75-4239-ade5-65b6d66b06b5 tempest-ServersAaction247Test-2099232595 tempest-ServersAaction247Test-2099232595-project-member] Lock "eeed1f83-89d2-4887-9e44-b269a2e295ae" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 127.432s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1011.203716] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager.update_available_resource {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1011.215661] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1011.215897] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1011.216083] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1011.216264] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=61439) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1011.217381] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ba870db-47a7-4060-a049-5f719377941d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.226187] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39821738-cd5e-443b-8557-6b2067fd97f3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.240173] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2509020-d3e5-42f9-a710-7d4b34c88a58 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.246441] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7431142-882a-48ae-b8fe-581383e4bd7c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.276464] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181572MB free_disk=35GB free_vcpus=48 pci_devices=None {{(pid=61439) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1011.276656] 
env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1011.276821] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1011.347367] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1011.347536] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance e51da790-9736-4181-9562-1a8f87895bd2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1011.347667] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 7f0c1eef-750c-4d8f-8d90-a02898fdeee1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1011.347828] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 8089bd3f-47e7-4490-8bfc-a1d87bf559ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1011.347949] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance b954e159-4d89-4c61-a5bc-5e6c67cf278c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1011.348075] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 2b55d3f3-cff9-4e34-936e-ece6759cfd40 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1011.348210] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance aeeb7c6c-7413-46b0-8632-c7224620e9b2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1011.348332] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 7b48729f-86a7-4b53-ad11-ef8a929ec947 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1011.348576] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1011.348763] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1011.450558] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b01a537a-ec8f-4f00-a4b2-13eba744d0b9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.458842] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41795a2b-015b-4037-a649-6d1f8b662d4e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.488630] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95771869-85de-4fec-8c87-e98c8ca402ab {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} 
[ 1011.495784] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e854669-e796-4c2b-ac0c-6aac26d6749f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.510238] env[61439]: DEBUG nova.compute.provider_tree [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1011.517964] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1011.531103] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=61439) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1011.531298] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.254s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1013.529651] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b 
None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1014.197585] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1014.201537] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1014.201537] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Starting heal instance info cache {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1014.201617] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Rebuilding the list of instances to heal {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1014.219353] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1014.219509] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Skipping network cache update for instance because it is Building. 
{{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1014.219646] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1014.219772] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1014.219896] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1014.220031] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1014.220160] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1014.220280] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Skipping network cache update for instance because it is Building. 
{{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1014.220402] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Didn't find any instances for network info cache update. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1014.220855] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1014.221059] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1014.221224] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1014.221357] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=61439) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1016.202238] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1016.202510] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1032.788955] env[61439]: WARNING oslo_vmware.rw_handles [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1032.788955] env[61439]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1032.788955] env[61439]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1032.788955] env[61439]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1032.788955] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1032.788955] env[61439]: ERROR oslo_vmware.rw_handles response.begin() [ 1032.788955] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1032.788955] env[61439]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1032.788955] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1032.788955] env[61439]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" 
[ 1032.788955] env[61439]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1032.788955] env[61439]: ERROR oslo_vmware.rw_handles [ 1032.788955] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Downloaded image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to vmware_temp/4e9175c7-7e78-4ffb-9740-1fa45a589454/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1032.790222] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Caching image {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1032.790382] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Copying Virtual Disk [datastore2] vmware_temp/4e9175c7-7e78-4ffb-9740-1fa45a589454/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk to [datastore2] vmware_temp/4e9175c7-7e78-4ffb-9740-1fa45a589454/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk {{(pid=61439) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1032.790608] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ecf6b8f9-84c3-4918-bbf5-1e9ae0450a14 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1032.800227] env[61439]: DEBUG oslo_vmware.api [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 
tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Waiting for the task: (returnval){ [ 1032.800227] env[61439]: value = "task-987723" [ 1032.800227] env[61439]: _type = "Task" [ 1032.800227] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1032.808117] env[61439]: DEBUG oslo_vmware.api [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Task: {'id': task-987723, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1033.310020] env[61439]: DEBUG oslo_vmware.exceptions [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Fault InvalidArgument not matched. {{(pid=61439) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1033.310020] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1033.310386] env[61439]: ERROR nova.compute.manager [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1033.310386] env[61439]: Faults: ['InvalidArgument'] [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: 
d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Traceback (most recent call last): [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] yield resources [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] self.driver.spawn(context, instance, image_meta, [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] self._fetch_image_if_missing(context, vi) [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] image_cache(vi, tmp_image_ds_loc) [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: 
d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] vm_util.copy_virtual_disk( [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] session._wait_for_task(vmdk_copy_task) [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] return self.wait_for_task(task_ref) [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] return evt.wait() [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] result = hub.switch() [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] return self.greenlet.switch() [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: 
d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] self.f(*self.args, **self.kw) [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] raise exceptions.translate_fault(task_info.error) [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Faults: ['InvalidArgument'] [ 1033.310386] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] [ 1033.311172] env[61439]: INFO nova.compute.manager [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Terminating instance [ 1033.312253] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1033.312483] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1033.312726] env[61439]: DEBUG oslo_vmware.service [-] Invoking 
FileManager.MakeDirectory with opID=oslo.vmware-0fa738db-a83b-41a6-b71a-ecc7374e80a9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1033.314970] env[61439]: DEBUG nova.compute.manager [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1033.315173] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1033.315859] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d091e2a1-dcba-41a4-b535-86905d9b8ae7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1033.322824] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Unregistering the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1033.323762] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9e571a55-6263-4724-b947-d80b21b52f13 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1033.325092] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 
tempest-AttachVolumeShelveTestJSON-664442013-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1033.325268] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=61439) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1033.325953] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6a401115-b53e-4553-9c39-8b860f20f0d7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1033.330808] env[61439]: DEBUG oslo_vmware.api [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Waiting for the task: (returnval){ [ 1033.330808] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]523a1da9-bd09-f49d-bc5b-3955cdf5f398" [ 1033.330808] env[61439]: _type = "Task" [ 1033.330808] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1033.342126] env[61439]: DEBUG oslo_vmware.api [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]523a1da9-bd09-f49d-bc5b-3955cdf5f398, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1033.400307] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Unregistered the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1033.400536] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Deleting contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1033.400668] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Deleting the datastore file [datastore2] d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1033.400937] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e5ce46ac-158c-4e5a-a9f7-a8f07ae5a8ff {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1033.407019] env[61439]: DEBUG oslo_vmware.api [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Waiting for the task: (returnval){ [ 1033.407019] env[61439]: value = "task-987725" [ 1033.407019] env[61439]: _type = "Task" [ 1033.407019] env[61439]: } to complete. 
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1033.414211] env[61439]: DEBUG oslo_vmware.api [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Task: {'id': task-987725, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1033.841061] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Preparing fetch location {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1033.841418] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Creating directory with path [datastore2] vmware_temp/db111b4a-3bb2-411a-b901-ce9ffeee652a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1033.841538] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fcc10e72-177e-41c4-adb2-7fe1e3ae8dd5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1033.853378] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Created directory with path [datastore2] vmware_temp/db111b4a-3bb2-411a-b901-ce9ffeee652a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1033.853581] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None 
req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Fetch image to [datastore2] vmware_temp/db111b4a-3bb2-411a-b901-ce9ffeee652a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1033.853754] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to [datastore2] vmware_temp/db111b4a-3bb2-411a-b901-ce9ffeee652a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1033.854484] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-304140d1-ec10-4be4-b9e9-511c03cd1b04 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1033.860905] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cc0da0a-9c71-4595-9462-0203b9cad4fc {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1033.869640] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e0f3f07-cf86-4421-839a-a61ecf49fc8b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1033.900116] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5a37d99-38a0-4822-81fa-d80c429500ef {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1033.905873] env[61439]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2b790d7b-a92a-4572-aef5-b5269e851e43 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1033.914501] env[61439]: DEBUG oslo_vmware.api [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Task: {'id': task-987725, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078183} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1033.914726] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Deleted the datastore file {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1033.914906] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Deleted contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1033.915088] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1033.915265] env[61439]: INFO nova.compute.manager [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: 
d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Took 0.60 seconds to destroy the instance on the hypervisor. [ 1033.917254] env[61439]: DEBUG nova.compute.claims [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1033.917426] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1033.917637] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1033.931461] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1033.980503] env[61439]: DEBUG oslo_vmware.rw_handles [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = 
https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/db111b4a-3bb2-411a-b901-ce9ffeee652a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1034.039787] env[61439]: DEBUG oslo_vmware.rw_handles [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Completed reading data from the image iterator. {{(pid=61439) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1034.039970] env[61439]: DEBUG oslo_vmware.rw_handles [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/db111b4a-3bb2-411a-b901-ce9ffeee652a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=61439) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1034.107468] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4a30e6b-91fb-4557-87fd-ac97bcec8643 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1034.114671] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8279f38f-5a71-4604-850e-74451913869a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1034.143746] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bf0d192-9e8b-4607-996e-31f93e031c7d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1034.150300] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36f782b0-46d8-4a82-b433-623bf6a319c4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1034.163982] env[61439]: DEBUG nova.compute.provider_tree [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1034.172141] env[61439]: DEBUG nova.scheduler.client.report [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': 
{'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1034.184751] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.267s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1034.185253] env[61439]: ERROR nova.compute.manager [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1034.185253] env[61439]: Faults: ['InvalidArgument'] [ 1034.185253] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Traceback (most recent call last): [ 1034.185253] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1034.185253] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] self.driver.spawn(context, instance, image_meta, [ 1034.185253] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1034.185253] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 
1034.185253] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1034.185253] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] self._fetch_image_if_missing(context, vi) [ 1034.185253] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1034.185253] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] image_cache(vi, tmp_image_ds_loc) [ 1034.185253] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1034.185253] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] vm_util.copy_virtual_disk( [ 1034.185253] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1034.185253] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] session._wait_for_task(vmdk_copy_task) [ 1034.185253] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1034.185253] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] return self.wait_for_task(task_ref) [ 1034.185253] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1034.185253] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] return evt.wait() [ 1034.185253] env[61439]: ERROR 
nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1034.185253] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] result = hub.switch() [ 1034.185253] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1034.185253] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] return self.greenlet.switch() [ 1034.185253] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1034.185253] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] self.f(*self.args, **self.kw) [ 1034.185253] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1034.185253] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] raise exceptions.translate_fault(task_info.error) [ 1034.185253] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1034.185253] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Faults: ['InvalidArgument'] [ 1034.185253] env[61439]: ERROR nova.compute.manager [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] [ 1034.185938] env[61439]: DEBUG nova.compute.utils [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] 
VimFaultException {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1034.187236] env[61439]: DEBUG nova.compute.manager [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Build of instance d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d was re-scheduled: A specified parameter was not correct: fileType [ 1034.187236] env[61439]: Faults: ['InvalidArgument'] {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1034.187616] env[61439]: DEBUG nova.compute.manager [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1034.187785] env[61439]: DEBUG nova.compute.manager [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1034.187956] env[61439]: DEBUG nova.compute.manager [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1034.188131] env[61439]: DEBUG nova.network.neutron [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1034.582316] env[61439]: DEBUG nova.network.neutron [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1034.594620] env[61439]: INFO nova.compute.manager [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] [instance: d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d] Took 0.41 seconds to deallocate network for instance. 
[ 1034.712433] env[61439]: INFO nova.scheduler.client.report [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Deleted allocations for instance d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d [ 1034.733554] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0a63ee0a-42b4-4edf-a643-bd214151c0e5 tempest-ServerTagsTestJSON-568839792 tempest-ServerTagsTestJSON-568839792-project-member] Lock "d9325bb4-cb9d-4ce5-9eea-f74af6a85a2d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 178.984s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1054.732184] env[61439]: DEBUG oslo_concurrency.lockutils [None req-361ef6a6-9059-4dee-aa46-937c09ff7611 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Acquiring lock "e51da790-9736-4181-9562-1a8f87895bd2" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1056.694319] env[61439]: DEBUG oslo_concurrency.lockutils [None req-59b05444-64eb-4cf8-8e9f-9260f88109c6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "8089bd3f-47e7-4490-8bfc-a1d87bf559ef" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1056.873699] env[61439]: DEBUG oslo_concurrency.lockutils [None req-44d140a9-3bb6-405c-bf41-7f780cdf25f4 tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Acquiring lock "7f0c1eef-750c-4d8f-8d90-a02898fdeee1" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=61439) 
inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1061.647109] env[61439]: DEBUG oslo_concurrency.lockutils [None req-46ad05df-01c0-49aa-b345-8c3db0155aea tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Acquiring lock "b954e159-4d89-4c61-a5bc-5e6c67cf278c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1064.744328] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c216e848-0514-449f-be57-d4ecaef3cb36 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Acquiring lock "aeeb7c6c-7413-46b0-8632-c7224620e9b2" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1073.203615] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager.update_available_resource {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1073.226818] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1073.226818] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1073.226967] env[61439]: 
DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1073.227138] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=61439) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1073.228512] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25588f60-5f67-4f5c-be61-7293004fc5d4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.239014] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b66b8fd5-ee5d-457e-a94d-ca5c7cec3a29 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.251518] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b82d4ef5-ea9d-4c79-bbb8-5a7da9ecf4de {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.257733] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b10cd1a8-4c1a-4e00-89a9-e1d6a8056af1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.287548] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181564MB free_disk=35GB free_vcpus=48 
pci_devices=None {{(pid=61439) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1073.287684] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1073.287877] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1073.387062] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance e51da790-9736-4181-9562-1a8f87895bd2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1073.387062] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 7f0c1eef-750c-4d8f-8d90-a02898fdeee1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1073.387062] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 8089bd3f-47e7-4490-8bfc-a1d87bf559ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1073.387062] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance b954e159-4d89-4c61-a5bc-5e6c67cf278c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1073.387062] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 2b55d3f3-cff9-4e34-936e-ece6759cfd40 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1073.387062] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance aeeb7c6c-7413-46b0-8632-c7224620e9b2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1073.387062] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 7b48729f-86a7-4b53-ad11-ef8a929ec947 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1073.387496] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1073.387496] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1073.475125] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46a12109-3c99-4c55-875e-0cdd967ae334 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.483069] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7bf3f27-3bd8-41d6-b3e1-a621cee891d4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.512024] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d71da3f-e947-4f11-accc-aab6e2ac5f54 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.518894] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a75fabc-b224-42ab-8766-e2118c58c475 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.531677] env[61439]: DEBUG nova.compute.provider_tree [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has 
not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1073.539261] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1073.557326] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=61439) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1073.557517] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.270s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1074.556547] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1075.197391] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=61439) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1075.201060] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1075.201198] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Starting heal instance info cache {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1075.201322] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Rebuilding the list of instances to heal {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1075.217402] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1075.217563] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1075.217695] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Skipping network cache update for instance because it is Building. 
{{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1075.217820] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1075.217943] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1075.218079] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1075.218203] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1075.218325] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Didn't find any instances for network info cache update. 
{{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1075.218767] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1075.218948] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1076.201550] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1076.201947] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1076.201947] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=61439) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1077.199191] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1078.201564] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1079.103278] env[61439]: WARNING oslo_vmware.rw_handles [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1079.103278] env[61439]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1079.103278] env[61439]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1079.103278] env[61439]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1079.103278] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1079.103278] env[61439]: ERROR oslo_vmware.rw_handles response.begin() [ 1079.103278] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1079.103278] env[61439]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1079.103278] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1079.103278] env[61439]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end 
closed connection without" [ 1079.103278] env[61439]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1079.103278] env[61439]: ERROR oslo_vmware.rw_handles [ 1079.103964] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Downloaded image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to vmware_temp/db111b4a-3bb2-411a-b901-ce9ffeee652a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1079.105585] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Caching image {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1079.105821] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Copying Virtual Disk [datastore2] vmware_temp/db111b4a-3bb2-411a-b901-ce9ffeee652a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk to [datastore2] vmware_temp/db111b4a-3bb2-411a-b901-ce9ffeee652a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk {{(pid=61439) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1079.106119] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-45b1442e-3a02-458b-9f51-fe218216423d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.114611] env[61439]: 
DEBUG oslo_vmware.api [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Waiting for the task: (returnval){ [ 1079.114611] env[61439]: value = "task-987726" [ 1079.114611] env[61439]: _type = "Task" [ 1079.114611] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1079.122605] env[61439]: DEBUG oslo_vmware.api [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Task: {'id': task-987726, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1079.625327] env[61439]: DEBUG oslo_vmware.exceptions [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Fault InvalidArgument not matched. 
{{(pid=61439) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1079.625644] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1079.626132] env[61439]: ERROR nova.compute.manager [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1079.626132] env[61439]: Faults: ['InvalidArgument'] [ 1079.626132] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] Traceback (most recent call last): [ 1079.626132] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1079.626132] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] yield resources [ 1079.626132] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1079.626132] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] self.driver.spawn(context, instance, image_meta, [ 1079.626132] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1079.626132] env[61439]: ERROR 
nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1079.626132] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1079.626132] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] self._fetch_image_if_missing(context, vi) [ 1079.626132] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1079.626132] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] image_cache(vi, tmp_image_ds_loc) [ 1079.626132] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1079.626132] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] vm_util.copy_virtual_disk( [ 1079.626132] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1079.626132] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] session._wait_for_task(vmdk_copy_task) [ 1079.626132] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1079.626132] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] return self.wait_for_task(task_ref) [ 1079.626132] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1079.626132] env[61439]: 
ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] return evt.wait() [ 1079.626132] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1079.626132] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] result = hub.switch() [ 1079.626132] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1079.626132] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] return self.greenlet.switch() [ 1079.626132] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1079.626132] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] self.f(*self.args, **self.kw) [ 1079.626132] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1079.626132] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] raise exceptions.translate_fault(task_info.error) [ 1079.626132] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1079.626132] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] Faults: ['InvalidArgument'] [ 1079.626132] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] [ 1079.627038] env[61439]: INFO nova.compute.manager [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 
tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Terminating instance [ 1079.628018] env[61439]: DEBUG oslo_concurrency.lockutils [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1079.628232] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1079.628463] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6d82db46-10ca-43d4-9e9e-61da1d80ac5e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.631758] env[61439]: DEBUG nova.compute.manager [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1079.631961] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1079.632712] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bcbe81f-f8d0-4b98-9521-54b6bac77a68 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.639423] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Unregistering the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1079.639627] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0030e6aa-c002-47bf-99a2-f8090b496f6d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.641683] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1079.641852] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=61439) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1079.642794] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7bf9a4c6-de77-493f-9acc-4d0d8c06683f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.647193] env[61439]: DEBUG oslo_vmware.api [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Waiting for the task: (returnval){ [ 1079.647193] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]527953ae-ad77-990e-7b4a-849c798372cc" [ 1079.647193] env[61439]: _type = "Task" [ 1079.647193] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1079.658824] env[61439]: DEBUG oslo_vmware.api [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]527953ae-ad77-990e-7b4a-849c798372cc, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1079.709927] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Unregistered the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1079.710170] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Deleting contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1079.710402] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Deleting the datastore file [datastore2] e51da790-9736-4181-9562-1a8f87895bd2 {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1079.710653] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a566b557-6f90-4cec-9728-b44aa5f7660d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.717258] env[61439]: DEBUG oslo_vmware.api [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Waiting for the task: (returnval){ [ 1079.717258] env[61439]: value = "task-987728" [ 1079.717258] env[61439]: _type = "Task" [ 1079.717258] env[61439]: } to complete. 
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1079.724471] env[61439]: DEBUG oslo_vmware.api [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Task: {'id': task-987728, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1080.157224] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Preparing fetch location {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1080.157487] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Creating directory with path [datastore2] vmware_temp/201b99d1-c3b1-4786-abb5-8078c1f265cf/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1080.157717] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d80dd117-217b-4d73-b24f-b955130d1253 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1080.169108] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Created directory with path [datastore2] vmware_temp/201b99d1-c3b1-4786-abb5-8078c1f265cf/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1080.169278] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None 
req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Fetch image to [datastore2] vmware_temp/201b99d1-c3b1-4786-abb5-8078c1f265cf/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1080.169448] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to [datastore2] vmware_temp/201b99d1-c3b1-4786-abb5-8078c1f265cf/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1080.170174] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15078cea-edb5-431e-b520-8477d1059d3b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1080.176686] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24d69e25-5d1d-4868-a7fe-13578a0f6acd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1080.185772] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-715b2428-9468-4251-b3a8-3dc6c3b241a9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1080.216632] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43146fd8-55ef-4a3f-b4a5-263670067016 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1080.227474] env[61439]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-41b68d93-638f-4b37-a6bd-5d4475382a36 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1080.229072] env[61439]: DEBUG oslo_vmware.api [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Task: {'id': task-987728, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073304} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1080.229307] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Deleted the datastore file {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1080.229486] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Deleted contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1080.229658] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1080.229834] env[61439]: INFO nova.compute.manager [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 
tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Took 0.60 seconds to destroy the instance on the hypervisor. [ 1080.232220] env[61439]: DEBUG nova.compute.claims [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1080.232428] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1080.232647] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1080.249481] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1080.321901] env[61439]: DEBUG oslo_vmware.rw_handles [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] 
Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/201b99d1-c3b1-4786-abb5-8078c1f265cf/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1080.382644] env[61439]: DEBUG oslo_vmware.rw_handles [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Completed reading data from the image iterator. {{(pid=61439) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1080.382804] env[61439]: DEBUG oslo_vmware.rw_handles [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/201b99d1-c3b1-4786-abb5-8078c1f265cf/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=61439) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1080.432160] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-561594c5-9284-48e4-8f11-770910ac71e7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1080.440052] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e6b8f5d-ec24-4619-aa3a-6cef02516a36 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1080.469770] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f3e23e2-54a5-48cb-8203-327aa7aa5c22 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1080.477355] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48e8b8ae-de75-4711-89ac-72a73a8f0367 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1080.490949] env[61439]: DEBUG nova.compute.provider_tree [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1080.499105] env[61439]: DEBUG nova.scheduler.client.report [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 
'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1080.526323] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.294s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1080.526854] env[61439]: ERROR nova.compute.manager [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1080.526854] env[61439]: Faults: ['InvalidArgument'] [ 1080.526854] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] Traceback (most recent call last): [ 1080.526854] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1080.526854] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] self.driver.spawn(context, instance, image_meta, [ 1080.526854] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1080.526854] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] 
self._vmops.spawn(context, instance, image_meta, injected_files, [ 1080.526854] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1080.526854] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] self._fetch_image_if_missing(context, vi) [ 1080.526854] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1080.526854] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] image_cache(vi, tmp_image_ds_loc) [ 1080.526854] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1080.526854] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] vm_util.copy_virtual_disk( [ 1080.526854] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1080.526854] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] session._wait_for_task(vmdk_copy_task) [ 1080.526854] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1080.526854] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] return self.wait_for_task(task_ref) [ 1080.526854] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1080.526854] env[61439]: ERROR nova.compute.manager [instance: 
e51da790-9736-4181-9562-1a8f87895bd2] return evt.wait() [ 1080.526854] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1080.526854] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] result = hub.switch() [ 1080.526854] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1080.526854] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] return self.greenlet.switch() [ 1080.526854] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1080.526854] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] self.f(*self.args, **self.kw) [ 1080.526854] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1080.526854] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] raise exceptions.translate_fault(task_info.error) [ 1080.526854] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1080.526854] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] Faults: ['InvalidArgument'] [ 1080.526854] env[61439]: ERROR nova.compute.manager [instance: e51da790-9736-4181-9562-1a8f87895bd2] [ 1080.527576] env[61439]: DEBUG nova.compute.utils [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 
tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] VimFaultException {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1080.529029] env[61439]: DEBUG nova.compute.manager [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Build of instance e51da790-9736-4181-9562-1a8f87895bd2 was re-scheduled: A specified parameter was not correct: fileType [ 1080.529029] env[61439]: Faults: ['InvalidArgument'] {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1080.529408] env[61439]: DEBUG nova.compute.manager [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1080.529584] env[61439]: DEBUG nova.compute.manager [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1080.529766] env[61439]: DEBUG nova.compute.manager [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1080.529927] env[61439]: DEBUG nova.network.neutron [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1080.959505] env[61439]: DEBUG nova.network.neutron [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1080.975396] env[61439]: INFO nova.compute.manager [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Took 0.45 seconds to deallocate network for instance. 
[ 1081.110159] env[61439]: INFO nova.scheduler.client.report [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Deleted allocations for instance e51da790-9736-4181-9562-1a8f87895bd2 [ 1081.157866] env[61439]: DEBUG oslo_concurrency.lockutils [None req-e6f19eca-71a3-4525-8921-22c7ebc68175 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Lock "e51da790-9736-4181-9562-1a8f87895bd2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 223.192s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1081.158166] env[61439]: DEBUG oslo_concurrency.lockutils [None req-361ef6a6-9059-4dee-aa46-937c09ff7611 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Lock "e51da790-9736-4181-9562-1a8f87895bd2" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 26.426s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1081.158393] env[61439]: DEBUG oslo_concurrency.lockutils [None req-361ef6a6-9059-4dee-aa46-937c09ff7611 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Acquiring lock "e51da790-9736-4181-9562-1a8f87895bd2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1081.158606] env[61439]: DEBUG oslo_concurrency.lockutils [None req-361ef6a6-9059-4dee-aa46-937c09ff7611 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Lock "e51da790-9736-4181-9562-1a8f87895bd2-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1081.158844] env[61439]: DEBUG oslo_concurrency.lockutils [None req-361ef6a6-9059-4dee-aa46-937c09ff7611 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Lock "e51da790-9736-4181-9562-1a8f87895bd2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1081.161380] env[61439]: INFO nova.compute.manager [None req-361ef6a6-9059-4dee-aa46-937c09ff7611 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Terminating instance [ 1081.162948] env[61439]: DEBUG nova.compute.manager [None req-361ef6a6-9059-4dee-aa46-937c09ff7611 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1081.163171] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-361ef6a6-9059-4dee-aa46-937c09ff7611 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1081.163862] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f0c57368-c500-4a5e-82e6-7e79bf781717 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1081.173808] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a454f3e5-87ff-4737-855e-a698629cea4e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1081.204337] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-361ef6a6-9059-4dee-aa46-937c09ff7611 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e51da790-9736-4181-9562-1a8f87895bd2 could not be found. 
[ 1081.205026] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-361ef6a6-9059-4dee-aa46-937c09ff7611 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1081.205026] env[61439]: INFO nova.compute.manager [None req-361ef6a6-9059-4dee-aa46-937c09ff7611 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1081.205332] env[61439]: DEBUG oslo.service.loopingcall [None req-361ef6a6-9059-4dee-aa46-937c09ff7611 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1081.205657] env[61439]: DEBUG nova.compute.manager [-] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1081.205815] env[61439]: DEBUG nova.network.neutron [-] [instance: e51da790-9736-4181-9562-1a8f87895bd2] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1081.234046] env[61439]: DEBUG nova.network.neutron [-] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1081.244222] env[61439]: INFO nova.compute.manager [-] [instance: e51da790-9736-4181-9562-1a8f87895bd2] Took 0.04 seconds to deallocate network for instance. 
[ 1081.369344] env[61439]: DEBUG oslo_concurrency.lockutils [None req-361ef6a6-9059-4dee-aa46-937c09ff7611 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Lock "e51da790-9736-4181-9562-1a8f87895bd2" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.211s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1084.719673] env[61439]: DEBUG oslo_concurrency.lockutils [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Acquiring lock "42999bc8-a3be-4310-97ad-324c7f4fc8d4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1084.721185] env[61439]: DEBUG oslo_concurrency.lockutils [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Lock "42999bc8-a3be-4310-97ad-324c7f4fc8d4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1084.742261] env[61439]: DEBUG nova.compute.manager [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1084.833775] env[61439]: DEBUG oslo_concurrency.lockutils [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1084.834052] env[61439]: DEBUG oslo_concurrency.lockutils [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1084.835887] env[61439]: INFO nova.compute.claims [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1084.975979] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79073b51-c979-458f-99cf-282fe8348ae7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1084.984325] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-818257e8-9429-45c7-89bb-5966d290144b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1085.015013] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c6b2524-2911-441f-ba67-7d735560c259 {{(pid=61439) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1085.022084] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3e2bc33-c474-4eb7-81ed-34e6ebf2aaba {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1085.035517] env[61439]: DEBUG nova.compute.provider_tree [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1085.046034] env[61439]: DEBUG nova.scheduler.client.report [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1085.073609] env[61439]: DEBUG oslo_concurrency.lockutils [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.239s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1085.074142] env[61439]: DEBUG 
nova.compute.manager [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1085.106691] env[61439]: DEBUG nova.compute.utils [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1085.107986] env[61439]: DEBUG nova.compute.manager [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1085.108577] env[61439]: DEBUG nova.network.neutron [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1085.119036] env[61439]: DEBUG nova.compute.manager [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Start building block device mappings for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1085.194066] env[61439]: DEBUG nova.compute.manager [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1085.222920] env[61439]: DEBUG nova.policy [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5c5974ebfc844e4c8a2947542dd55524', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9ff88efffe9443a39391aab1d573993a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 1085.233135] env[61439]: DEBUG nova.virt.hardware [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1085.233377] env[61439]: DEBUG nova.virt.hardware [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1085.233543] env[61439]: DEBUG nova.virt.hardware [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1085.233723] env[61439]: DEBUG nova.virt.hardware [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1085.233870] env[61439]: DEBUG nova.virt.hardware [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1085.234028] env[61439]: DEBUG nova.virt.hardware [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Chose sockets=0, 
cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1085.234239] env[61439]: DEBUG nova.virt.hardware [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1085.234400] env[61439]: DEBUG nova.virt.hardware [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1085.234571] env[61439]: DEBUG nova.virt.hardware [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1085.234747] env[61439]: DEBUG nova.virt.hardware [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1085.234934] env[61439]: DEBUG nova.virt.hardware [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1085.236033] env[61439]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5bd04ed-9639-40c3-bdcd-10010698b20b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1085.244140] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-068e3ad1-6da7-4e86-8341-0ab1a7ebedd2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1085.660202] env[61439]: DEBUG nova.network.neutron [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Successfully created port: cfa8d549-cc91-44b2-81b9-dc585ca82ba1 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1086.291428] env[61439]: DEBUG nova.compute.manager [req-5797c77a-0842-4521-861f-7c091616666b req-68622af0-dc55-4ee1-ae93-e702ccc683b5 service nova] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Received event network-vif-plugged-cfa8d549-cc91-44b2-81b9-dc585ca82ba1 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1086.291751] env[61439]: DEBUG oslo_concurrency.lockutils [req-5797c77a-0842-4521-861f-7c091616666b req-68622af0-dc55-4ee1-ae93-e702ccc683b5 service nova] Acquiring lock "42999bc8-a3be-4310-97ad-324c7f4fc8d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1086.291974] env[61439]: DEBUG oslo_concurrency.lockutils [req-5797c77a-0842-4521-861f-7c091616666b req-68622af0-dc55-4ee1-ae93-e702ccc683b5 service nova] Lock "42999bc8-a3be-4310-97ad-324c7f4fc8d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=61439) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1086.292163] env[61439]: DEBUG oslo_concurrency.lockutils [req-5797c77a-0842-4521-861f-7c091616666b req-68622af0-dc55-4ee1-ae93-e702ccc683b5 service nova] Lock "42999bc8-a3be-4310-97ad-324c7f4fc8d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1086.292336] env[61439]: DEBUG nova.compute.manager [req-5797c77a-0842-4521-861f-7c091616666b req-68622af0-dc55-4ee1-ae93-e702ccc683b5 service nova] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] No waiting events found dispatching network-vif-plugged-cfa8d549-cc91-44b2-81b9-dc585ca82ba1 {{(pid=61439) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1086.292682] env[61439]: WARNING nova.compute.manager [req-5797c77a-0842-4521-861f-7c091616666b req-68622af0-dc55-4ee1-ae93-e702ccc683b5 service nova] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Received unexpected event network-vif-plugged-cfa8d549-cc91-44b2-81b9-dc585ca82ba1 for instance with vm_state building and task_state spawning. 
[ 1086.421290] env[61439]: DEBUG nova.network.neutron [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Successfully updated port: cfa8d549-cc91-44b2-81b9-dc585ca82ba1 {{(pid=61439) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1086.445537] env[61439]: DEBUG oslo_concurrency.lockutils [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Acquiring lock "refresh_cache-42999bc8-a3be-4310-97ad-324c7f4fc8d4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1086.445692] env[61439]: DEBUG oslo_concurrency.lockutils [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Acquired lock "refresh_cache-42999bc8-a3be-4310-97ad-324c7f4fc8d4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1086.445845] env[61439]: DEBUG nova.network.neutron [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1086.484781] env[61439]: DEBUG nova.network.neutron [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1086.717994] env[61439]: DEBUG nova.network.neutron [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Updating instance_info_cache with network_info: [{"id": "cfa8d549-cc91-44b2-81b9-dc585ca82ba1", "address": "fa:16:3e:b1:13:38", "network": {"id": "46a4de18-523d-44a2-8e81-92b838d568cc", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309006852-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9ff88efffe9443a39391aab1d573993a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92f3cfd6-c130-4390-8910-865fbc42afd1", "external-id": "nsx-vlan-transportzone-142", "segmentation_id": 142, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcfa8d549-cc", "ovs_interfaceid": "cfa8d549-cc91-44b2-81b9-dc585ca82ba1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1086.728772] env[61439]: DEBUG oslo_concurrency.lockutils [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Releasing lock "refresh_cache-42999bc8-a3be-4310-97ad-324c7f4fc8d4" {{(pid=61439) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1086.729067] env[61439]: DEBUG nova.compute.manager [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Instance network_info: |[{"id": "cfa8d549-cc91-44b2-81b9-dc585ca82ba1", "address": "fa:16:3e:b1:13:38", "network": {"id": "46a4de18-523d-44a2-8e81-92b838d568cc", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309006852-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9ff88efffe9443a39391aab1d573993a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92f3cfd6-c130-4390-8910-865fbc42afd1", "external-id": "nsx-vlan-transportzone-142", "segmentation_id": 142, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcfa8d549-cc", "ovs_interfaceid": "cfa8d549-cc91-44b2-81b9-dc585ca82ba1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1086.729484] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b1:13:38', 'network_ref': 
{'type': 'OpaqueNetwork', 'network-id': '92f3cfd6-c130-4390-8910-865fbc42afd1', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'cfa8d549-cc91-44b2-81b9-dc585ca82ba1', 'vif_model': 'vmxnet3'}] {{(pid=61439) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1086.737286] env[61439]: DEBUG oslo.service.loopingcall [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1086.737734] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Creating VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1086.737963] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-83deeb2d-39d9-427a-a659-d8ce941aa809 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1086.759244] env[61439]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1086.759244] env[61439]: value = "task-987729" [ 1086.759244] env[61439]: _type = "Task" [ 1086.759244] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1086.767365] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987729, 'name': CreateVM_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1087.270342] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987729, 'name': CreateVM_Task, 'duration_secs': 0.334017} completed successfully. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1087.270342] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Created VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1087.270986] env[61439]: DEBUG oslo_concurrency.lockutils [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1087.271178] env[61439]: DEBUG oslo_concurrency.lockutils [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1087.271515] env[61439]: DEBUG oslo_concurrency.lockutils [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1087.271767] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f69684c7-dc54-4c63-811d-fa2f7f43f2e3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1087.276694] env[61439]: DEBUG oslo_vmware.api [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 
tempest-AttachVolumeShelveTestJSON-664442013-project-member] Waiting for the task: (returnval){ [ 1087.276694] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]520207d7-e8c0-4ef9-f101-1cd070b9ef36" [ 1087.276694] env[61439]: _type = "Task" [ 1087.276694] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1087.284500] env[61439]: DEBUG oslo_vmware.api [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]520207d7-e8c0-4ef9-f101-1cd070b9ef36, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1087.788625] env[61439]: DEBUG oslo_concurrency.lockutils [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1087.788893] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Processing image a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1087.789097] env[61439]: DEBUG oslo_concurrency.lockutils [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Acquiring lock "[datastore2] 
devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1088.554072] env[61439]: DEBUG nova.compute.manager [req-3d03a66f-dc0f-49a3-ae57-34499510d2a0 req-5faca3ce-4640-4850-94ec-12e656df0f08 service nova] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Received event network-changed-cfa8d549-cc91-44b2-81b9-dc585ca82ba1 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1088.554293] env[61439]: DEBUG nova.compute.manager [req-3d03a66f-dc0f-49a3-ae57-34499510d2a0 req-5faca3ce-4640-4850-94ec-12e656df0f08 service nova] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Refreshing instance network info cache due to event network-changed-cfa8d549-cc91-44b2-81b9-dc585ca82ba1. {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1088.554524] env[61439]: DEBUG oslo_concurrency.lockutils [req-3d03a66f-dc0f-49a3-ae57-34499510d2a0 req-5faca3ce-4640-4850-94ec-12e656df0f08 service nova] Acquiring lock "refresh_cache-42999bc8-a3be-4310-97ad-324c7f4fc8d4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1088.554711] env[61439]: DEBUG oslo_concurrency.lockutils [req-3d03a66f-dc0f-49a3-ae57-34499510d2a0 req-5faca3ce-4640-4850-94ec-12e656df0f08 service nova] Acquired lock "refresh_cache-42999bc8-a3be-4310-97ad-324c7f4fc8d4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1088.554898] env[61439]: DEBUG nova.network.neutron [req-3d03a66f-dc0f-49a3-ae57-34499510d2a0 req-5faca3ce-4640-4850-94ec-12e656df0f08 service nova] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Refreshing network info cache for port cfa8d549-cc91-44b2-81b9-dc585ca82ba1 {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1088.830542] 
env[61439]: DEBUG nova.network.neutron [req-3d03a66f-dc0f-49a3-ae57-34499510d2a0 req-5faca3ce-4640-4850-94ec-12e656df0f08 service nova] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Updated VIF entry in instance network info cache for port cfa8d549-cc91-44b2-81b9-dc585ca82ba1. {{(pid=61439) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1088.830894] env[61439]: DEBUG nova.network.neutron [req-3d03a66f-dc0f-49a3-ae57-34499510d2a0 req-5faca3ce-4640-4850-94ec-12e656df0f08 service nova] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Updating instance_info_cache with network_info: [{"id": "cfa8d549-cc91-44b2-81b9-dc585ca82ba1", "address": "fa:16:3e:b1:13:38", "network": {"id": "46a4de18-523d-44a2-8e81-92b838d568cc", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309006852-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9ff88efffe9443a39391aab1d573993a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92f3cfd6-c130-4390-8910-865fbc42afd1", "external-id": "nsx-vlan-transportzone-142", "segmentation_id": 142, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcfa8d549-cc", "ovs_interfaceid": "cfa8d549-cc91-44b2-81b9-dc585ca82ba1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1088.841659] env[61439]: DEBUG oslo_concurrency.lockutils [req-3d03a66f-dc0f-49a3-ae57-34499510d2a0 
req-5faca3ce-4640-4850-94ec-12e656df0f08 service nova] Releasing lock "refresh_cache-42999bc8-a3be-4310-97ad-324c7f4fc8d4" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1126.523851] env[61439]: WARNING oslo_vmware.rw_handles [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1126.523851] env[61439]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1126.523851] env[61439]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1126.523851] env[61439]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1126.523851] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1126.523851] env[61439]: ERROR oslo_vmware.rw_handles response.begin() [ 1126.523851] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1126.523851] env[61439]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1126.523851] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1126.523851] env[61439]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1126.523851] env[61439]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1126.523851] env[61439]: ERROR oslo_vmware.rw_handles [ 1126.524754] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Downloaded 
image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to vmware_temp/201b99d1-c3b1-4786-abb5-8078c1f265cf/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1126.526851] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Caching image {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1126.527187] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Copying Virtual Disk [datastore2] vmware_temp/201b99d1-c3b1-4786-abb5-8078c1f265cf/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk to [datastore2] vmware_temp/201b99d1-c3b1-4786-abb5-8078c1f265cf/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk {{(pid=61439) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1126.527567] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-955f79a7-f3ff-4101-81ae-bfc40bfb5a1a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1126.536494] env[61439]: DEBUG oslo_vmware.api [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Waiting for the task: (returnval){ [ 1126.536494] env[61439]: value = "task-987730" [ 1126.536494] env[61439]: _type = "Task" [ 1126.536494] env[61439]: } to complete. 
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1126.544815] env[61439]: DEBUG oslo_vmware.api [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Task: {'id': task-987730, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1127.047693] env[61439]: DEBUG oslo_vmware.exceptions [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Fault InvalidArgument not matched. {{(pid=61439) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1127.048743] env[61439]: DEBUG oslo_concurrency.lockutils [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1127.048910] env[61439]: ERROR nova.compute.manager [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1127.048910] env[61439]: Faults: ['InvalidArgument'] [ 1127.048910] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Traceback (most recent call last): [ 1127.048910] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1127.048910] 
env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] yield resources [ 1127.048910] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1127.048910] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] self.driver.spawn(context, instance, image_meta, [ 1127.048910] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1127.048910] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1127.048910] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1127.048910] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] self._fetch_image_if_missing(context, vi) [ 1127.048910] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1127.048910] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] image_cache(vi, tmp_image_ds_loc) [ 1127.048910] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1127.048910] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] vm_util.copy_virtual_disk( [ 1127.048910] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1127.048910] env[61439]: ERROR 
nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] session._wait_for_task(vmdk_copy_task) [ 1127.048910] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1127.048910] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] return self.wait_for_task(task_ref) [ 1127.048910] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1127.048910] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] return evt.wait() [ 1127.048910] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1127.048910] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] result = hub.switch() [ 1127.048910] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1127.048910] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] return self.greenlet.switch() [ 1127.048910] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1127.048910] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] self.f(*self.args, **self.kw) [ 1127.048910] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1127.048910] env[61439]: 
ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] raise exceptions.translate_fault(task_info.error)
[ 1127.048910] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1127.048910] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Faults: ['InvalidArgument']
[ 1127.048910] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef]
[ 1127.050497] env[61439]: INFO nova.compute.manager [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Terminating instance
[ 1127.050855] env[61439]: DEBUG oslo_concurrency.lockutils [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1127.051078] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1127.051590] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-11854794-d3e0-4b22-a68a-81cc6b826d36 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1127.053916] env[61439]: DEBUG nova.compute.manager [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6
tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1127.054600] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1127.054837] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1ecf3ad-e1d8-490c-85c5-d1fa8490ade1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1127.062156] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Unregistering the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1127.062387] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d611cb73-9b1e-4d76-8709-0ccc5e396fd2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1127.064744] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1127.064917] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528
tempest-ServerRescueTestJSON-1180270528-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=61439) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1127.065944] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d4fa9dc9-bf6f-4dde-a016-d06d193f7ead {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1127.071026] env[61439]: DEBUG oslo_vmware.api [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Waiting for the task: (returnval){
[ 1127.071026] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]5216a3e6-51ef-1b6a-fd5f-c32a855a44fd"
[ 1127.071026] env[61439]: _type = "Task"
[ 1127.071026] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1127.078869] env[61439]: DEBUG oslo_vmware.api [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]5216a3e6-51ef-1b6a-fd5f-c32a855a44fd, 'name': SearchDatastore_Task} progress is 0%.
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1127.175141] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Unregistered the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1127.175379] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Deleting contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1127.175566] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Deleting the datastore file [datastore2] 8089bd3f-47e7-4490-8bfc-a1d87bf559ef {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1127.175856] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3f95468a-c0fa-4def-9098-c5ca243226e2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1127.182210] env[61439]: DEBUG oslo_vmware.api [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Waiting for the task: (returnval){
[ 1127.182210] env[61439]: value = "task-987732"
[ 1127.182210] env[61439]: _type = "Task"
[ 1127.182210] env[61439]: } to complete.
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1127.190075] env[61439]: DEBUG oslo_vmware.api [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Task: {'id': task-987732, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1127.581105] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Preparing fetch location {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1127.581486] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Creating directory with path [datastore2] vmware_temp/c7e76b98-d8ff-4c17-8760-e5be3385fe6d/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1127.581594] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2393e09b-2c3a-4b0a-b08d-d890d97cab22 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1127.593112] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Created directory with path [datastore2] vmware_temp/c7e76b98-d8ff-4c17-8760-e5be3385fe6d/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1127.593309] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None
req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Fetch image to [datastore2] vmware_temp/c7e76b98-d8ff-4c17-8760-e5be3385fe6d/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1127.593503] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to [datastore2] vmware_temp/c7e76b98-d8ff-4c17-8760-e5be3385fe6d/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1127.594276] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3bc2589-7841-416d-81fc-47d3becadd7a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1127.600806] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0d26595-3c61-4f91-a324-c6958d543adf {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1127.610772] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db477473-f383-4b25-a814-95b65b7fc78b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1127.640320] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-707e2132-a9cb-4004-9362-d895209bf555 {{(pid=61439) request_handler
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1127.646015] env[61439]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-712694ad-327a-4940-84fb-913405f91284 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1127.665280] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1127.695722] env[61439]: DEBUG oslo_vmware.api [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Task: {'id': task-987732, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.080045} completed successfully.
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1127.695976] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Deleted the datastore file {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1127.696182] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Deleted contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1127.696356] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1127.696529] env[61439]: INFO nova.compute.manager [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Took 0.64 seconds to destroy the instance on the hypervisor.
[ 1127.698605] env[61439]: DEBUG nova.compute.claims [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1127.698782] env[61439]: DEBUG oslo_concurrency.lockutils [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1127.698998] env[61439]: DEBUG oslo_concurrency.lockutils [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1127.715680] env[61439]: DEBUG oslo_vmware.rw_handles [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c7e76b98-d8ff-4c17-8760-e5be3385fe6d/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2.
{{(pid=61439) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1127.777724] env[61439]: DEBUG oslo_vmware.rw_handles [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Completed reading data from the image iterator. {{(pid=61439) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1127.777915] env[61439]: DEBUG oslo_vmware.rw_handles [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c7e76b98-d8ff-4c17-8760-e5be3385fe6d/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1127.871006] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6a51ad7-18f2-4d42-baa8-01c1b9d14aa6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1127.879183] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c97a6d67-55a5-4d57-881a-096d961ac408 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1127.910525] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f02d100c-7952-4b7b-a798-79fc861ccd3c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1127.917716] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9cdc79d-e963-461b-9a09-80c27f2991ec {{(pid=61439)
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1127.930823] env[61439]: DEBUG nova.compute.provider_tree [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1127.938982] env[61439]: DEBUG nova.scheduler.client.report [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1127.953190] env[61439]: DEBUG oslo_concurrency.lockutils [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.254s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1127.953727] env[61439]: ERROR nova.compute.manager [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not
correct: fileType
[ 1127.953727] env[61439]: Faults: ['InvalidArgument']
[ 1127.953727] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Traceback (most recent call last):
[ 1127.953727] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1127.953727] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] self.driver.spawn(context, instance, image_meta,
[ 1127.953727] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1127.953727] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1127.953727] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1127.953727] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] self._fetch_image_if_missing(context, vi)
[ 1127.953727] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1127.953727] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] image_cache(vi, tmp_image_ds_loc)
[ 1127.953727] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1127.953727] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] vm_util.copy_virtual_disk(
[ 1127.953727] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] File
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1127.953727] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] session._wait_for_task(vmdk_copy_task)
[ 1127.953727] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1127.953727] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] return self.wait_for_task(task_ref)
[ 1127.953727] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1127.953727] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] return evt.wait()
[ 1127.953727] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1127.953727] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] result = hub.switch()
[ 1127.953727] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1127.953727] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] return self.greenlet.switch()
[ 1127.953727] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1127.953727] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] self.f(*self.args, **self.kw)
[ 1127.953727] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] File
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1127.953727] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] raise exceptions.translate_fault(task_info.error)
[ 1127.953727] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1127.953727] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Faults: ['InvalidArgument']
[ 1127.953727] env[61439]: ERROR nova.compute.manager [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef]
[ 1127.954497] env[61439]: DEBUG nova.compute.utils [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] VimFaultException {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1127.956046] env[61439]: DEBUG nova.compute.manager [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Build of instance 8089bd3f-47e7-4490-8bfc-a1d87bf559ef was re-scheduled: A specified parameter was not correct: fileType
[ 1127.956046] env[61439]: Faults: ['InvalidArgument'] {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1127.956253] env[61439]: DEBUG nova.compute.manager [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1127.956430] env[61439]: DEBUG nova.compute.manager [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6
tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 1127.956600] env[61439]: DEBUG nova.compute.manager [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1127.956765] env[61439]: DEBUG nova.network.neutron [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1128.356543] env[61439]: DEBUG nova.network.neutron [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1128.369148] env[61439]: INFO nova.compute.manager [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Took 0.41 seconds to deallocate network for instance.
[ 1128.459474] env[61439]: INFO nova.scheduler.client.report [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Deleted allocations for instance 8089bd3f-47e7-4490-8bfc-a1d87bf559ef
[ 1128.479127] env[61439]: DEBUG oslo_concurrency.lockutils [None req-27ca987c-126d-47a1-8cf5-daadd80d35a6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "8089bd3f-47e7-4490-8bfc-a1d87bf559ef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 267.885s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1128.479373] env[61439]: DEBUG oslo_concurrency.lockutils [None req-59b05444-64eb-4cf8-8e9f-9260f88109c6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "8089bd3f-47e7-4490-8bfc-a1d87bf559ef" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 71.785s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1128.479606] env[61439]: DEBUG oslo_concurrency.lockutils [None req-59b05444-64eb-4cf8-8e9f-9260f88109c6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Acquiring lock "8089bd3f-47e7-4490-8bfc-a1d87bf559ef-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1128.479830] env[61439]: DEBUG oslo_concurrency.lockutils [None req-59b05444-64eb-4cf8-8e9f-9260f88109c6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "8089bd3f-47e7-4490-8bfc-a1d87bf559ef-events" acquired by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1128.479976] env[61439]: DEBUG oslo_concurrency.lockutils [None req-59b05444-64eb-4cf8-8e9f-9260f88109c6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "8089bd3f-47e7-4490-8bfc-a1d87bf559ef-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1128.483046] env[61439]: INFO nova.compute.manager [None req-59b05444-64eb-4cf8-8e9f-9260f88109c6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Terminating instance
[ 1128.483542] env[61439]: DEBUG nova.compute.manager [None req-59b05444-64eb-4cf8-8e9f-9260f88109c6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Start destroying the instance on the hypervisor.
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1128.483739] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-59b05444-64eb-4cf8-8e9f-9260f88109c6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1128.484207] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a6fdec55-bcd8-4ced-87ac-9f2c06111900 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1128.493533] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a3f1f1a-ca37-4546-8482-e77375a2dec9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1128.518424] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-59b05444-64eb-4cf8-8e9f-9260f88109c6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8089bd3f-47e7-4490-8bfc-a1d87bf559ef could not be found.
[ 1128.518632] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-59b05444-64eb-4cf8-8e9f-9260f88109c6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1128.518955] env[61439]: INFO nova.compute.manager [None req-59b05444-64eb-4cf8-8e9f-9260f88109c6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 1128.519071] env[61439]: DEBUG oslo.service.loopingcall [None req-59b05444-64eb-4cf8-8e9f-9260f88109c6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1128.519333] env[61439]: DEBUG nova.compute.manager [-] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1128.519392] env[61439]: DEBUG nova.network.neutron [-] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1128.541919] env[61439]: DEBUG nova.network.neutron [-] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1128.549592] env[61439]: INFO nova.compute.manager [-] [instance: 8089bd3f-47e7-4490-8bfc-a1d87bf559ef] Took 0.03 seconds to deallocate network for instance. 
[ 1128.639313] env[61439]: DEBUG oslo_concurrency.lockutils [None req-59b05444-64eb-4cf8-8e9f-9260f88109c6 tempest-DeleteServersTestJSON-64622698 tempest-DeleteServersTestJSON-64622698-project-member] Lock "8089bd3f-47e7-4490-8bfc-a1d87bf559ef" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.160s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1132.425166] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9b37df87-4d2a-441b-99b1-cfc44f0dbe6b tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Acquiring lock "7b48729f-86a7-4b53-ad11-ef8a929ec947" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1134.201700] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1134.201952] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager.update_available_resource {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1134.215942] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1134.216191] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" 
acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1134.216361] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1134.216868] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=61439) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1134.218014] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df13b4c3-ad2e-4c7e-9733-444849d41197 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1134.227378] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b782cd6-bb33-4a39-8604-44fe19467786 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1134.241786] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98b6ba28-814b-428e-b329-756ba50154dd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1134.248516] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb53d5c2-d6a6-4067-836c-64ee10605ad4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1134.278730] 
env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181532MB free_disk=35GB free_vcpus=48 pci_devices=None {{(pid=61439) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1134.278846] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1134.279068] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1134.339599] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 7f0c1eef-750c-4d8f-8d90-a02898fdeee1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1134.339599] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance b954e159-4d89-4c61-a5bc-5e6c67cf278c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1134.339599] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 2b55d3f3-cff9-4e34-936e-ece6759cfd40 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1134.339599] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance aeeb7c6c-7413-46b0-8632-c7224620e9b2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1134.339599] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 7b48729f-86a7-4b53-ad11-ef8a929ec947 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1134.339891] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 42999bc8-a3be-4310-97ad-324c7f4fc8d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1134.359206] env[61439]: INFO nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 5403acfb-04d2-4081-80c7-23d662410a20 has allocations against this compute host but is not found in the database. 
[ 1134.359469] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1134.359636] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1134.444708] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Acquiring lock "5403acfb-04d2-4081-80c7-23d662410a20" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1134.445186] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Lock "5403acfb-04d2-4081-80c7-23d662410a20" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1134.461654] env[61439]: DEBUG nova.compute.manager [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1134.472966] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ed6c968-6382-4aad-8297-0345268c8781 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1134.480564] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c76d592b-5f0d-433b-83b1-bed15acfe522 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1134.513972] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12c8259d-ba88-47e1-a67b-53e5e283c32f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1134.523369] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cfb185f-6ade-4070-8ec3-27d1e6f35c07 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1134.529314] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1134.537180] env[61439]: DEBUG nova.compute.provider_tree [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1134.545152] env[61439]: DEBUG nova.scheduler.client.report [None 
req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1134.557588] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=61439) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1134.557822] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.279s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1134.558037] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.029s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1134.559848] env[61439]: INFO nova.compute.claims [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Claim successful on node 
domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1134.694818] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14a112fc-96b7-4921-93ce-89aa5eaf5824 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1134.703891] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34b8f284-3329-4e54-97b6-feb39e8aadbe {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1134.733361] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a4d6492-a429-412f-9022-8191395fac31 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1134.740492] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40e7944a-09ae-4aee-9bc8-8ea4abdcf1e0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1134.753209] env[61439]: DEBUG nova.compute.provider_tree [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1134.763237] env[61439]: DEBUG nova.scheduler.client.report [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 
196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1134.777621] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.220s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1134.778107] env[61439]: DEBUG nova.compute.manager [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1134.818754] env[61439]: DEBUG nova.compute.utils [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1134.819998] env[61439]: DEBUG nova.compute.manager [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1134.820191] env[61439]: DEBUG nova.network.neutron [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1134.828244] env[61439]: DEBUG nova.compute.manager [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1134.876919] env[61439]: DEBUG nova.policy [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '576b47a5c78d458fa934561690148b4e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2657ae3507d340b6950c02b104d16f7d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 1134.898082] env[61439]: DEBUG nova.compute.manager [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1134.924719] env[61439]: DEBUG nova.virt.hardware [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1134.924965] env[61439]: DEBUG nova.virt.hardware [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1134.925143] env[61439]: DEBUG nova.virt.hardware [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1134.925328] env[61439]: DEBUG nova.virt.hardware [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 
tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1134.925475] env[61439]: DEBUG nova.virt.hardware [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1134.925621] env[61439]: DEBUG nova.virt.hardware [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1134.925828] env[61439]: DEBUG nova.virt.hardware [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1134.925987] env[61439]: DEBUG nova.virt.hardware [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1134.926164] env[61439]: DEBUG nova.virt.hardware [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1134.926329] 
env[61439]: DEBUG nova.virt.hardware [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1134.926502] env[61439]: DEBUG nova.virt.hardware [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1134.927451] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be8f8f39-1c91-4680-b012-8e2b7225b372 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1134.935667] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5895b8e4-44db-409a-b528-707d47785061 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1135.304635] env[61439]: DEBUG nova.network.neutron [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Successfully created port: aab424be-9787-4c46-b691-bb2cb3255f81 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1135.557522] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 
1135.557522] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1135.897872] env[61439]: DEBUG nova.compute.manager [req-6e4e5962-c9dc-43a7-a443-8d633577889b req-50e8095d-f6d5-4ed3-b151-312318eba2e7 service nova] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Received event network-vif-plugged-aab424be-9787-4c46-b691-bb2cb3255f81 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1135.897872] env[61439]: DEBUG oslo_concurrency.lockutils [req-6e4e5962-c9dc-43a7-a443-8d633577889b req-50e8095d-f6d5-4ed3-b151-312318eba2e7 service nova] Acquiring lock "5403acfb-04d2-4081-80c7-23d662410a20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1135.897872] env[61439]: DEBUG oslo_concurrency.lockutils [req-6e4e5962-c9dc-43a7-a443-8d633577889b req-50e8095d-f6d5-4ed3-b151-312318eba2e7 service nova] Lock "5403acfb-04d2-4081-80c7-23d662410a20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1135.897872] env[61439]: DEBUG oslo_concurrency.lockutils [req-6e4e5962-c9dc-43a7-a443-8d633577889b req-50e8095d-f6d5-4ed3-b151-312318eba2e7 service nova] Lock "5403acfb-04d2-4081-80c7-23d662410a20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1135.898657] env[61439]: DEBUG nova.compute.manager [req-6e4e5962-c9dc-43a7-a443-8d633577889b 
req-50e8095d-f6d5-4ed3-b151-312318eba2e7 service nova] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] No waiting events found dispatching network-vif-plugged-aab424be-9787-4c46-b691-bb2cb3255f81 {{(pid=61439) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1135.899010] env[61439]: WARNING nova.compute.manager [req-6e4e5962-c9dc-43a7-a443-8d633577889b req-50e8095d-f6d5-4ed3-b151-312318eba2e7 service nova] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Received unexpected event network-vif-plugged-aab424be-9787-4c46-b691-bb2cb3255f81 for instance with vm_state building and task_state spawning. [ 1135.975905] env[61439]: DEBUG nova.network.neutron [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Successfully updated port: aab424be-9787-4c46-b691-bb2cb3255f81 {{(pid=61439) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1135.989888] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Acquiring lock "refresh_cache-5403acfb-04d2-4081-80c7-23d662410a20" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1135.990047] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Acquired lock "refresh_cache-5403acfb-04d2-4081-80c7-23d662410a20" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1135.990199] env[61439]: DEBUG nova.network.neutron [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 
tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1136.026759] env[61439]: DEBUG nova.network.neutron [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1136.201137] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1136.201264] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Starting heal instance info cache {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1136.201396] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Rebuilding the list of instances to heal {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1136.221193] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1136.221400] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Skipping network cache update for instance because it is Building. 
{{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1136.221499] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1136.221627] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1136.221754] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1136.221880] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1136.222013] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1136.222147] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Didn't find any instances for network info cache update. 
{{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1136.222665] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1136.222841] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1136.223165] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=61439) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1136.226142] env[61439]: DEBUG nova.network.neutron [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Updating instance_info_cache with network_info: [{"id": "aab424be-9787-4c46-b691-bb2cb3255f81", "address": "fa:16:3e:01:43:37", "network": {"id": "af782100-d922-412a-8661-1f4821d6928c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1581660408-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2657ae3507d340b6950c02b104d16f7d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", 
"port_filter": true, "nsx-logical-switch-id": "ea45c024-d603-4bac-9c1b-f302437ea4fe", "external-id": "nsx-vlan-transportzone-946", "segmentation_id": 946, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapaab424be-97", "ovs_interfaceid": "aab424be-9787-4c46-b691-bb2cb3255f81", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1136.238214] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Releasing lock "refresh_cache-5403acfb-04d2-4081-80c7-23d662410a20" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1136.238494] env[61439]: DEBUG nova.compute.manager [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Instance network_info: |[{"id": "aab424be-9787-4c46-b691-bb2cb3255f81", "address": "fa:16:3e:01:43:37", "network": {"id": "af782100-d922-412a-8661-1f4821d6928c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1581660408-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2657ae3507d340b6950c02b104d16f7d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, 
"nsx-logical-switch-id": "ea45c024-d603-4bac-9c1b-f302437ea4fe", "external-id": "nsx-vlan-transportzone-946", "segmentation_id": 946, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapaab424be-97", "ovs_interfaceid": "aab424be-9787-4c46-b691-bb2cb3255f81", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1136.238867] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:01:43:37', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ea45c024-d603-4bac-9c1b-f302437ea4fe', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'aab424be-9787-4c46-b691-bb2cb3255f81', 'vif_model': 'vmxnet3'}] {{(pid=61439) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1136.246187] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Creating folder: Project (2657ae3507d340b6950c02b104d16f7d). Parent ref: group-v221281. 
{{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1136.246647] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-35130b88-6238-4cbd-9fb3-289e525460cd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1136.256477] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Created folder: Project (2657ae3507d340b6950c02b104d16f7d) in parent group-v221281. [ 1136.256673] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Creating folder: Instances. Parent ref: group-v221332. {{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1136.256928] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8b0b1a09-e6f9-4a30-9c06-cd4878d302cc {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1136.265213] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Created folder: Instances in parent group-v221332. [ 1136.265434] env[61439]: DEBUG oslo.service.loopingcall [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1136.265605] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Creating VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1136.265837] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a14fff42-7dd2-40f4-8ca4-d8ba6edbcba6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1136.283676] env[61439]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1136.283676] env[61439]: value = "task-987735" [ 1136.283676] env[61439]: _type = "Task" [ 1136.283676] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1136.290794] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987735, 'name': CreateVM_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1136.793521] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987735, 'name': CreateVM_Task, 'duration_secs': 0.313561} completed successfully. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1136.793848] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Created VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1136.794398] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1136.794662] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1136.795028] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1136.795283] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3a3bfd15-55a3-40ba-9cc6-ade5986e1fa4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1136.800015] env[61439]: DEBUG oslo_vmware.api [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 
tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Waiting for the task: (returnval){ [ 1136.800015] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52ee785a-1459-22ff-e9b0-73cb655180cb" [ 1136.800015] env[61439]: _type = "Task" [ 1136.800015] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1136.807438] env[61439]: DEBUG oslo_vmware.api [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52ee785a-1459-22ff-e9b0-73cb655180cb, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1137.202657] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1137.310327] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1137.310659] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Processing image a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1137.310914] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1137.923223] env[61439]: DEBUG nova.compute.manager [req-46dde90f-b3fc-405b-bfb3-ae6f74e80803 req-60835987-d2e2-4335-8d8e-fb29ef85d3ad service nova] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Received event network-changed-aab424be-9787-4c46-b691-bb2cb3255f81 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1137.923458] env[61439]: DEBUG nova.compute.manager [req-46dde90f-b3fc-405b-bfb3-ae6f74e80803 req-60835987-d2e2-4335-8d8e-fb29ef85d3ad service nova] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Refreshing instance network info cache due to event network-changed-aab424be-9787-4c46-b691-bb2cb3255f81. 
{{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1137.923661] env[61439]: DEBUG oslo_concurrency.lockutils [req-46dde90f-b3fc-405b-bfb3-ae6f74e80803 req-60835987-d2e2-4335-8d8e-fb29ef85d3ad service nova] Acquiring lock "refresh_cache-5403acfb-04d2-4081-80c7-23d662410a20" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1137.923811] env[61439]: DEBUG oslo_concurrency.lockutils [req-46dde90f-b3fc-405b-bfb3-ae6f74e80803 req-60835987-d2e2-4335-8d8e-fb29ef85d3ad service nova] Acquired lock "refresh_cache-5403acfb-04d2-4081-80c7-23d662410a20" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1137.923974] env[61439]: DEBUG nova.network.neutron [req-46dde90f-b3fc-405b-bfb3-ae6f74e80803 req-60835987-d2e2-4335-8d8e-fb29ef85d3ad service nova] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Refreshing network info cache for port aab424be-9787-4c46-b691-bb2cb3255f81 {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1138.152973] env[61439]: DEBUG nova.network.neutron [req-46dde90f-b3fc-405b-bfb3-ae6f74e80803 req-60835987-d2e2-4335-8d8e-fb29ef85d3ad service nova] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Updated VIF entry in instance network info cache for port aab424be-9787-4c46-b691-bb2cb3255f81. 
{{(pid=61439) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1138.153334] env[61439]: DEBUG nova.network.neutron [req-46dde90f-b3fc-405b-bfb3-ae6f74e80803 req-60835987-d2e2-4335-8d8e-fb29ef85d3ad service nova] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Updating instance_info_cache with network_info: [{"id": "aab424be-9787-4c46-b691-bb2cb3255f81", "address": "fa:16:3e:01:43:37", "network": {"id": "af782100-d922-412a-8661-1f4821d6928c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1581660408-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2657ae3507d340b6950c02b104d16f7d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ea45c024-d603-4bac-9c1b-f302437ea4fe", "external-id": "nsx-vlan-transportzone-946", "segmentation_id": 946, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapaab424be-97", "ovs_interfaceid": "aab424be-9787-4c46-b691-bb2cb3255f81", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1138.162159] env[61439]: DEBUG oslo_concurrency.lockutils [req-46dde90f-b3fc-405b-bfb3-ae6f74e80803 req-60835987-d2e2-4335-8d8e-fb29ef85d3ad service nova] Releasing lock "refresh_cache-5403acfb-04d2-4081-80c7-23d662410a20" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1138.201592] env[61439]: DEBUG 
oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1177.671630] env[61439]: WARNING oslo_vmware.rw_handles [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1177.671630] env[61439]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1177.671630] env[61439]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1177.671630] env[61439]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1177.671630] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1177.671630] env[61439]: ERROR oslo_vmware.rw_handles response.begin() [ 1177.671630] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1177.671630] env[61439]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1177.671630] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1177.671630] env[61439]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1177.671630] env[61439]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1177.671630] env[61439]: ERROR oslo_vmware.rw_handles [ 1177.672313] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 
7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Downloaded image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to vmware_temp/c7e76b98-d8ff-4c17-8760-e5be3385fe6d/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1177.674075] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Caching image {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1177.674333] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Copying Virtual Disk [datastore2] vmware_temp/c7e76b98-d8ff-4c17-8760-e5be3385fe6d/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk to [datastore2] vmware_temp/c7e76b98-d8ff-4c17-8760-e5be3385fe6d/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk {{(pid=61439) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1177.674632] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-290b8e0a-8301-4228-8d18-3454e1e584fd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1177.682395] env[61439]: DEBUG oslo_vmware.api [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Waiting for the task: (returnval){ [ 1177.682395] env[61439]: value = "task-987736" [ 1177.682395] env[61439]: _type = "Task" [ 1177.682395] env[61439]: } to complete. 
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1177.690752] env[61439]: DEBUG oslo_vmware.api [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Task: {'id': task-987736, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1178.192725] env[61439]: DEBUG oslo_vmware.exceptions [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Fault InvalidArgument not matched. {{(pid=61439) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1178.192969] env[61439]: DEBUG oslo_concurrency.lockutils [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1178.193575] env[61439]: ERROR nova.compute.manager [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1178.193575] env[61439]: Faults: ['InvalidArgument'] [ 1178.193575] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Traceback (most recent call last): [ 1178.193575] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 
1178.193575] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] yield resources [ 1178.193575] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1178.193575] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] self.driver.spawn(context, instance, image_meta, [ 1178.193575] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1178.193575] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1178.193575] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1178.193575] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] self._fetch_image_if_missing(context, vi) [ 1178.193575] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1178.193575] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] image_cache(vi, tmp_image_ds_loc) [ 1178.193575] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1178.193575] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] vm_util.copy_virtual_disk( [ 1178.193575] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1178.193575] env[61439]: ERROR 
nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] session._wait_for_task(vmdk_copy_task) [ 1178.193575] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1178.193575] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] return self.wait_for_task(task_ref) [ 1178.193575] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1178.193575] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] return evt.wait() [ 1178.193575] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1178.193575] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] result = hub.switch() [ 1178.193575] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1178.193575] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] return self.greenlet.switch() [ 1178.193575] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1178.193575] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] self.f(*self.args, **self.kw) [ 1178.193575] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1178.193575] env[61439]: 
ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] raise exceptions.translate_fault(task_info.error) [ 1178.193575] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1178.193575] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Faults: ['InvalidArgument'] [ 1178.193575] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] [ 1178.194561] env[61439]: INFO nova.compute.manager [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Terminating instance [ 1178.195399] env[61439]: DEBUG oslo_concurrency.lockutils [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1178.195607] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1178.195831] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-607cf6f6-add1-4a29-80f4-06ab8c095810 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1178.197907] env[61439]: DEBUG nova.compute.manager [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e 
tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1178.198120] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1178.198793] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6e8b9a5-409c-4413-ae92-6878be6061ac {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1178.205327] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Unregistering the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1178.205536] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-8d30b1b8-e115-48a8-9cc2-7fc3de6bfcf6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1178.207588] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1178.207762] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-325872bc-0f93-4c55-a545-7c70ed051360 
tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=61439) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1178.208700] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-21ebbbc6-ebb9-42a2-b6a8-7daa61a17f69 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1178.213837] env[61439]: DEBUG oslo_vmware.api [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Waiting for the task: (returnval){ [ 1178.213837] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]526e3976-bf9c-1b21-a41d-a98e54e06744" [ 1178.213837] env[61439]: _type = "Task" [ 1178.213837] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1178.221230] env[61439]: DEBUG oslo_vmware.api [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]526e3976-bf9c-1b21-a41d-a98e54e06744, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1178.271184] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Unregistered the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1178.271400] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Deleting contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1178.271581] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Deleting the datastore file [datastore2] 7f0c1eef-750c-4d8f-8d90-a02898fdeee1 {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1178.271834] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-05f22717-cce6-41f4-b242-1e3bf8e2224e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1178.277890] env[61439]: DEBUG oslo_vmware.api [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Waiting for the task: (returnval){ [ 1178.277890] env[61439]: value = "task-987738" [ 1178.277890] env[61439]: _type = "Task" [ 1178.277890] env[61439]: } to complete. 
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1178.285172] env[61439]: DEBUG oslo_vmware.api [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Task: {'id': task-987738, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1178.723296] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Preparing fetch location {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1178.723650] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Creating directory with path [datastore2] vmware_temp/a5ea7da1-cc07-47c7-aac5-a26af558d49a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1178.723819] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d969d6d3-0d6f-4b52-8d9b-1c967495038b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1178.734680] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Created directory with path [datastore2] vmware_temp/a5ea7da1-cc07-47c7-aac5-a26af558d49a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1178.734872] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None 
req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Fetch image to [datastore2] vmware_temp/a5ea7da1-cc07-47c7-aac5-a26af558d49a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1178.735057] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to [datastore2] vmware_temp/a5ea7da1-cc07-47c7-aac5-a26af558d49a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1178.735765] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0116d6f8-8ce9-4478-8aad-0801a7d6d5d9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1178.742021] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f30499d-4dd8-4a01-a3dc-52583c2d76e8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1178.751494] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5eea14f2-a39f-40ad-abc9-d19638202d6f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1178.785336] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bf8427f-524e-4048-929d-06d292bb4c96 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1178.792360] env[61439]: DEBUG oslo_vmware.api [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Task: {'id': task-987738, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070465} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1178.793868] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Deleted the datastore file {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1178.794079] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Deleted contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1178.794258] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1178.794435] env[61439]: INFO nova.compute.manager [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1178.796188] env[61439]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-bd04e0c5-e52d-40ee-9ba4-fcfcca6f55aa {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1178.798083] env[61439]: DEBUG nova.compute.claims [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1178.798262] env[61439]: DEBUG oslo_concurrency.lockutils [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1178.798478] env[61439]: DEBUG oslo_concurrency.lockutils [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1178.817841] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1178.867467] env[61439]: DEBUG oslo_vmware.rw_handles [None req-325872bc-0f93-4c55-a545-7c70ed051360 
tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a5ea7da1-cc07-47c7-aac5-a26af558d49a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1178.926087] env[61439]: DEBUG oslo_vmware.rw_handles [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Completed reading data from the image iterator. {{(pid=61439) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1178.926288] env[61439]: DEBUG oslo_vmware.rw_handles [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a5ea7da1-cc07-47c7-aac5-a26af558d49a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=61439) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1178.984821] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-129d4ff1-8529-4e0b-b76b-367aae784477 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1178.992245] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04bb6d9a-7eac-401c-a736-808777bf0906 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1179.021529] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-008aa13f-4a58-4416-97a8-a0d19ebc14d7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1179.028243] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d1f1530-6911-465f-b887-7ff4d3960213 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1179.041857] env[61439]: DEBUG nova.compute.provider_tree [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1179.050733] env[61439]: DEBUG nova.scheduler.client.report [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 
'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1179.069712] env[61439]: DEBUG oslo_concurrency.lockutils [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.271s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1179.070266] env[61439]: ERROR nova.compute.manager [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1179.070266] env[61439]: Faults: ['InvalidArgument'] [ 1179.070266] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Traceback (most recent call last): [ 1179.070266] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1179.070266] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] self.driver.spawn(context, instance, image_meta, [ 1179.070266] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1179.070266] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] self._vmops.spawn(context, instance, image_meta, 
injected_files, [ 1179.070266] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1179.070266] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] self._fetch_image_if_missing(context, vi) [ 1179.070266] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1179.070266] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] image_cache(vi, tmp_image_ds_loc) [ 1179.070266] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1179.070266] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] vm_util.copy_virtual_disk( [ 1179.070266] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1179.070266] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] session._wait_for_task(vmdk_copy_task) [ 1179.070266] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1179.070266] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] return self.wait_for_task(task_ref) [ 1179.070266] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1179.070266] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] return evt.wait() [ 1179.070266] env[61439]: 
ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1179.070266] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] result = hub.switch() [ 1179.070266] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1179.070266] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] return self.greenlet.switch() [ 1179.070266] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1179.070266] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] self.f(*self.args, **self.kw) [ 1179.070266] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1179.070266] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] raise exceptions.translate_fault(task_info.error) [ 1179.070266] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1179.070266] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Faults: ['InvalidArgument'] [ 1179.070266] env[61439]: ERROR nova.compute.manager [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] [ 1179.071715] env[61439]: DEBUG nova.compute.utils [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 
7f0c1eef-750c-4d8f-8d90-a02898fdeee1] VimFaultException {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1179.072675] env[61439]: DEBUG nova.compute.manager [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Build of instance 7f0c1eef-750c-4d8f-8d90-a02898fdeee1 was re-scheduled: A specified parameter was not correct: fileType [ 1179.072675] env[61439]: Faults: ['InvalidArgument'] {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1179.073061] env[61439]: DEBUG nova.compute.manager [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1179.073236] env[61439]: DEBUG nova.compute.manager [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1179.073408] env[61439]: DEBUG nova.compute.manager [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1179.073594] env[61439]: DEBUG nova.network.neutron [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1179.507340] env[61439]: DEBUG nova.network.neutron [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1179.518904] env[61439]: INFO nova.compute.manager [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Took 0.45 seconds to deallocate network for instance. 
[ 1179.603894] env[61439]: INFO nova.scheduler.client.report [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Deleted allocations for instance 7f0c1eef-750c-4d8f-8d90-a02898fdeee1 [ 1179.622689] env[61439]: DEBUG oslo_concurrency.lockutils [None req-da2c9d5f-617a-4658-a4f7-9a2b8b18d74e tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Lock "7f0c1eef-750c-4d8f-8d90-a02898fdeee1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 319.421s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1179.622938] env[61439]: DEBUG oslo_concurrency.lockutils [None req-44d140a9-3bb6-405c-bf41-7f780cdf25f4 tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Lock "7f0c1eef-750c-4d8f-8d90-a02898fdeee1" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 122.749s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1179.623207] env[61439]: DEBUG oslo_concurrency.lockutils [None req-44d140a9-3bb6-405c-bf41-7f780cdf25f4 tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Acquiring lock "7f0c1eef-750c-4d8f-8d90-a02898fdeee1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1179.623457] env[61439]: DEBUG oslo_concurrency.lockutils [None req-44d140a9-3bb6-405c-bf41-7f780cdf25f4 tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Lock "7f0c1eef-750c-4d8f-8d90-a02898fdeee1-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1179.623670] env[61439]: DEBUG oslo_concurrency.lockutils [None req-44d140a9-3bb6-405c-bf41-7f780cdf25f4 tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Lock "7f0c1eef-750c-4d8f-8d90-a02898fdeee1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1179.625582] env[61439]: INFO nova.compute.manager [None req-44d140a9-3bb6-405c-bf41-7f780cdf25f4 tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Terminating instance [ 1179.627232] env[61439]: DEBUG nova.compute.manager [None req-44d140a9-3bb6-405c-bf41-7f780cdf25f4 tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1179.627425] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-44d140a9-3bb6-405c-bf41-7f780cdf25f4 tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1179.627878] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-39f2165a-08ed-490b-9469-b0ab808ca4ba {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1179.637235] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b05a8ec-c6d8-4c46-8289-60b58d36dee6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1179.662502] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-44d140a9-3bb6-405c-bf41-7f780cdf25f4 tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7f0c1eef-750c-4d8f-8d90-a02898fdeee1 could not be found. [ 1179.662705] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-44d140a9-3bb6-405c-bf41-7f780cdf25f4 tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1179.662882] env[61439]: INFO nova.compute.manager [None req-44d140a9-3bb6-405c-bf41-7f780cdf25f4 tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 1179.663134] env[61439]: DEBUG oslo.service.loopingcall [None req-44d140a9-3bb6-405c-bf41-7f780cdf25f4 tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1179.663359] env[61439]: DEBUG nova.compute.manager [-] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1179.663460] env[61439]: DEBUG nova.network.neutron [-] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1179.692153] env[61439]: DEBUG nova.network.neutron [-] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1179.699466] env[61439]: INFO nova.compute.manager [-] [instance: 7f0c1eef-750c-4d8f-8d90-a02898fdeee1] Took 0.04 seconds to deallocate network for instance. 
[ 1179.785427] env[61439]: DEBUG oslo_concurrency.lockutils [None req-44d140a9-3bb6-405c-bf41-7f780cdf25f4 tempest-ServerRescueTestJSON-1180270528 tempest-ServerRescueTestJSON-1180270528-project-member] Lock "7f0c1eef-750c-4d8f-8d90-a02898fdeee1" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.162s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1192.202566] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1192.202894] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Cleaning up deleted instances {{(pid=61439) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 1192.216424] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] There are 0 instances to clean {{(pid=61439) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1193.201387] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1193.201624] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Cleaning up deleted instances with incomplete migration {{(pid=61439) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 1194.202298] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=61439) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1195.205313] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1195.205713] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1196.201446] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1196.201692] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1196.201864] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager.update_available_resource {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1196.212801] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1196.213124] env[61439]: DEBUG oslo_concurrency.lockutils [None 
req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1196.213202] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1196.213336] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=61439) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1196.214434] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf66ebec-7e26-4c5c-a150-31e2f559ff30 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1196.224008] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-754b883c-9e28-417f-b417-8e59d815d784 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1196.237535] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e61aba54-8583-4c83-9596-6ec558a2128a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1196.243588] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9da0794-f081-4fea-809c-c8f8677f8158 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1196.271645] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181566MB free_disk=35GB free_vcpus=48 pci_devices=None {{(pid=61439) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1196.271818] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1196.271971] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1196.341134] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance b954e159-4d89-4c61-a5bc-5e6c67cf278c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1196.341302] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 2b55d3f3-cff9-4e34-936e-ece6759cfd40 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1196.341432] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance aeeb7c6c-7413-46b0-8632-c7224620e9b2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1196.341560] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 7b48729f-86a7-4b53-ad11-ef8a929ec947 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1196.341680] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 42999bc8-a3be-4310-97ad-324c7f4fc8d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1196.341799] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 5403acfb-04d2-4081-80c7-23d662410a20 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1196.341981] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1196.342138] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1196.418023] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aaebb226-2dc3-4797-b5c1-50f1f3544bc5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1196.425735] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-962b9924-3f68-4887-82d2-f1e9e78b9302 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1196.456101] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9faea7f-01a8-49f0-8c6e-ed975c178ac0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1196.462785] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85d5eb2b-6483-49fc-8821-9a71df0a6e63 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1196.475530] env[61439]: DEBUG nova.compute.provider_tree [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has 
not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1196.483386] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1196.496116] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=61439) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1196.496307] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.224s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1197.496616] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1197.496966] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=61439) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1198.201870] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1198.202073] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Starting heal instance info cache {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1198.202244] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Rebuilding the list of instances to heal {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1198.217785] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1198.217785] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1198.217940] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Skipping network cache update for instance because it is Building. 
{{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1198.218054] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1198.218185] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1198.218309] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1198.218433] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Didn't find any instances for network info cache update. 
{{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 1198.218944] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1200.203350] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1200.223283] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1204.243711] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._sync_power_states {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1204.259814] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Getting list of instances from cluster (obj){
[ 1204.259814] env[61439]:   value = "domain-c8"
[ 1204.259814] env[61439]:   _type = "ClusterComputeResource"
[ 1204.259814] env[61439]: } {{(pid=61439) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}}
[ 1204.261083] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db5cf1fd-76c4-476d-bfe2-ee0d704c3fb7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1204.274986] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None
req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Got total of 6 instances {{(pid=61439) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1204.275166] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Triggering sync for uuid b954e159-4d89-4c61-a5bc-5e6c67cf278c {{(pid=61439) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1204.275364] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Triggering sync for uuid 2b55d3f3-cff9-4e34-936e-ece6759cfd40 {{(pid=61439) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1204.275525] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Triggering sync for uuid aeeb7c6c-7413-46b0-8632-c7224620e9b2 {{(pid=61439) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1204.275684] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Triggering sync for uuid 7b48729f-86a7-4b53-ad11-ef8a929ec947 {{(pid=61439) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1204.275837] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Triggering sync for uuid 42999bc8-a3be-4310-97ad-324c7f4fc8d4 {{(pid=61439) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1204.275991] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Triggering sync for uuid 5403acfb-04d2-4081-80c7-23d662410a20 {{(pid=61439) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1204.276314] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "b954e159-4d89-4c61-a5bc-5e6c67cf278c" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=61439) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1204.276548] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "2b55d3f3-cff9-4e34-936e-ece6759cfd40" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1204.276756] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "aeeb7c6c-7413-46b0-8632-c7224620e9b2" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1204.276958] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "7b48729f-86a7-4b53-ad11-ef8a929ec947" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1204.277175] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "42999bc8-a3be-4310-97ad-324c7f4fc8d4" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1204.277377] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "5403acfb-04d2-4081-80c7-23d662410a20" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1228.743131] env[61439]: WARNING oslo_vmware.rw_handles [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1228.743131] env[61439]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1228.743131] env[61439]: ERROR oslo_vmware.rw_handles   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1228.743131] env[61439]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 1228.743131] env[61439]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1228.743131] env[61439]: ERROR oslo_vmware.rw_handles     response.begin()
[ 1228.743131] env[61439]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1228.743131] env[61439]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 1228.743131] env[61439]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1228.743131] env[61439]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 1228.743131] env[61439]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1228.743131] env[61439]: ERROR oslo_vmware.rw_handles
[ 1228.743796] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Downloaded image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to vmware_temp/a5ea7da1-cc07-47c7-aac5-a26af558d49a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439)
fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1228.745639] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Caching image {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1228.745890] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Copying Virtual Disk [datastore2] vmware_temp/a5ea7da1-cc07-47c7-aac5-a26af558d49a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk to [datastore2] vmware_temp/a5ea7da1-cc07-47c7-aac5-a26af558d49a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk {{(pid=61439) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1228.746274] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-60d096a3-ed64-447d-a77b-3e2773bf26e6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1228.754718] env[61439]: DEBUG oslo_vmware.api [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Waiting for the task: (returnval){
[ 1228.754718] env[61439]:   value = "task-987739"
[ 1228.754718] env[61439]:   _type = "Task"
[ 1228.754718] env[61439]: } to complete.
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1228.764065] env[61439]: DEBUG oslo_vmware.api [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Task: {'id': task-987739, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1229.265260] env[61439]: DEBUG oslo_vmware.exceptions [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Fault InvalidArgument not matched. {{(pid=61439) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 1229.265538] env[61439]: DEBUG oslo_concurrency.lockutils [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1229.266115] env[61439]: ERROR nova.compute.manager [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1229.266115] env[61439]: Faults: ['InvalidArgument']
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Traceback (most recent call last):
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40]   File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40]     yield resources
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40]     self.driver.spawn(context, instance, image_meta,
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40]     self._fetch_image_if_missing(context, vi)
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40]     image_cache(vi, tmp_image_ds_loc)
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40]     vm_util.copy_virtual_disk(
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40]     session._wait_for_task(vmdk_copy_task)
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40]     return self.wait_for_task(task_ref)
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40]     return evt.wait()
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40]     result = hub.switch()
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40]     return self.greenlet.switch()
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40]     self.f(*self.args, **self.kw)
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40]     raise exceptions.translate_fault(task_info.error)
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Faults: ['InvalidArgument']
[ 1229.266115] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40]
[ 1229.266963] env[61439]: INFO nova.compute.manager [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Terminating instance
[ 1229.268043] env[61439]: DEBUG oslo_concurrency.lockutils [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1229.268272] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1229.268507] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8f8fcb3b-fd21-439e-bf0c-a5aecc0386e7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1229.270593] env[61439]: DEBUG nova.compute.manager [None
req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1229.270786] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1229.271492] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71c5a262-1276-42ff-93ee-b79e882ab34f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1229.278180] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Unregistering the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1229.278384] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-158a6a12-80bc-4133-a449-42903e56eb08 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1229.280502] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1229.280677] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None 
req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=61439) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1229.281598] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8505c511-20ed-41a3-9c2f-d498c1e65ed3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1229.286290] env[61439]: DEBUG oslo_vmware.api [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Waiting for the task: (returnval){
[ 1229.286290] env[61439]:   value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52311560-fa43-6de7-69c5-23b63bbdf9bd"
[ 1229.286290] env[61439]:   _type = "Task"
[ 1229.286290] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1229.294035] env[61439]: DEBUG oslo_vmware.api [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52311560-fa43-6de7-69c5-23b63bbdf9bd, 'name': SearchDatastore_Task} progress is 0%.
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1229.356997] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Unregistered the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1229.357477] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Deleting contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1229.357688] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Deleting the datastore file [datastore2] 2b55d3f3-cff9-4e34-936e-ece6759cfd40 {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1229.357962] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8a494cfb-ac1b-46f8-a93b-30cf457bc1cc {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1229.363988] env[61439]: DEBUG oslo_vmware.api [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Waiting for the task: (returnval){ [ 1229.363988] env[61439]: value = "task-987741" [ 1229.363988] env[61439]: _type = "Task" [ 1229.363988] env[61439]: } to complete. 
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1229.371542] env[61439]: DEBUG oslo_vmware.api [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Task: {'id': task-987741, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1229.797881] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Preparing fetch location {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1229.798230] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Creating directory with path [datastore2] vmware_temp/fd2eb784-8252-4a5b-9533-f976bd7ccf0a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1229.798452] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b3d05037-4b61-411a-9f07-77e355346805 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1229.811598] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Created directory with path [datastore2] vmware_temp/fd2eb784-8252-4a5b-9533-f976bd7ccf0a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1229.811789] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None 
req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Fetch image to [datastore2] vmware_temp/fd2eb784-8252-4a5b-9533-f976bd7ccf0a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1229.811967] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to [datastore2] vmware_temp/fd2eb784-8252-4a5b-9533-f976bd7ccf0a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1229.812743] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11262fe2-a4e5-4394-95e9-df0163ee16fe {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1229.819181] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7e4a0bb-3ea7-411f-8b9f-9151d3ed9795 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1229.828383] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60844c65-059e-4ea6-b129-931655f915e5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1229.858739] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb03d8b9-d741-448e-841c-8cf768f451a0 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1229.864456] env[61439]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-61c30f67-6f63-4dca-953f-2e5a9985b735 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1229.875232] env[61439]: DEBUG oslo_vmware.api [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Task: {'id': task-987741, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070649} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1229.875495] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Deleted the datastore file {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1229.875693] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Deleted contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1229.875908] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1229.876104] env[61439]: INFO nova.compute.manager [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 
tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Took 0.61 seconds to destroy the instance on the hypervisor. [ 1229.878324] env[61439]: DEBUG nova.compute.claims [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1229.878538] env[61439]: DEBUG oslo_concurrency.lockutils [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1229.878766] env[61439]: DEBUG oslo_concurrency.lockutils [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1229.890353] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1229.945309] env[61439]: DEBUG oslo_vmware.rw_handles [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Creating HTTP 
connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fd2eb784-8252-4a5b-9533-f976bd7ccf0a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1230.005455] env[61439]: DEBUG oslo_vmware.rw_handles [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Completed reading data from the image iterator. {{(pid=61439) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1230.005653] env[61439]: DEBUG oslo_vmware.rw_handles [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fd2eb784-8252-4a5b-9533-f976bd7ccf0a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=61439) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1230.056430] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06b1e52c-d0aa-4aa5-970d-345c7bd29c83 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1230.063858] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ffcba82-0d3d-44c7-8b4b-f7bb0258f2c9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1230.093054] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f070445-fa1e-4e50-903d-b6cf9e5902f7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1230.100071] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e462b113-8055-4ea0-8bee-6544da45e7d7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1230.112817] env[61439]: DEBUG nova.compute.provider_tree [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1230.121150] env[61439]: DEBUG nova.scheduler.client.report [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 
4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1230.133472] env[61439]: DEBUG oslo_concurrency.lockutils [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.255s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1230.134009] env[61439]: ERROR nova.compute.manager [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1230.134009] env[61439]: Faults: ['InvalidArgument'] [ 1230.134009] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Traceback (most recent call last): [ 1230.134009] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1230.134009] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] self.driver.spawn(context, instance, image_meta, [ 1230.134009] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1230.134009] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] self._vmops.spawn(context, 
instance, image_meta, injected_files, [ 1230.134009] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1230.134009] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] self._fetch_image_if_missing(context, vi) [ 1230.134009] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1230.134009] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] image_cache(vi, tmp_image_ds_loc) [ 1230.134009] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1230.134009] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] vm_util.copy_virtual_disk( [ 1230.134009] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1230.134009] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] session._wait_for_task(vmdk_copy_task) [ 1230.134009] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1230.134009] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] return self.wait_for_task(task_ref) [ 1230.134009] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1230.134009] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] return evt.wait() [ 
1230.134009] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1230.134009] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] result = hub.switch() [ 1230.134009] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1230.134009] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] return self.greenlet.switch() [ 1230.134009] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1230.134009] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] self.f(*self.args, **self.kw) [ 1230.134009] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1230.134009] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] raise exceptions.translate_fault(task_info.error) [ 1230.134009] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1230.134009] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Faults: ['InvalidArgument'] [ 1230.134009] env[61439]: ERROR nova.compute.manager [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] [ 1230.134769] env[61439]: DEBUG nova.compute.utils [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 
2b55d3f3-cff9-4e34-936e-ece6759cfd40] VimFaultException {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1230.136023] env[61439]: DEBUG nova.compute.manager [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Build of instance 2b55d3f3-cff9-4e34-936e-ece6759cfd40 was re-scheduled: A specified parameter was not correct: fileType [ 1230.136023] env[61439]: Faults: ['InvalidArgument'] {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1230.136405] env[61439]: DEBUG nova.compute.manager [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1230.136581] env[61439]: DEBUG nova.compute.manager [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1230.136738] env[61439]: DEBUG nova.compute.manager [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1230.136903] env[61439]: DEBUG nova.network.neutron [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1230.460386] env[61439]: DEBUG nova.network.neutron [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1230.474217] env[61439]: INFO nova.compute.manager [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] Took 0.34 seconds to deallocate network for instance. 
[ 1230.566458] env[61439]: INFO nova.scheduler.client.report [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Deleted allocations for instance 2b55d3f3-cff9-4e34-936e-ece6759cfd40 [ 1230.583912] env[61439]: DEBUG oslo_concurrency.lockutils [None req-325872bc-0f93-4c55-a545-7c70ed051360 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Lock "2b55d3f3-cff9-4e34-936e-ece6759cfd40" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 362.654s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1230.584190] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "2b55d3f3-cff9-4e34-936e-ece6759cfd40" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 26.308s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1230.584388] env[61439]: INFO nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 2b55d3f3-cff9-4e34-936e-ece6759cfd40] During sync_power_state the instance has a pending task (spawning). Skip. 
[ 1230.584566] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "2b55d3f3-cff9-4e34-936e-ece6759cfd40" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1256.202494] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1256.202807] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager.update_available_resource {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1256.213178] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1256.213443] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1256.213687] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=61439) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1256.213882] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=61439) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1256.214982] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4f7fb9a-cf47-4a69-b7dc-6ac1773ccee2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1256.223818] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08bceda6-d8fe-4199-bad7-01344b2e73fb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1256.237521] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-626dca62-4c95-49b9-86b4-4bacb1819f51 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1256.243560] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c83ac2e-7660-4ba6-b6af-8b88f5c2a1ab {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1256.271692] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181567MB free_disk=35GB free_vcpus=48 pci_devices=None {{(pid=61439) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1256.271837] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] 
Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1256.272034] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1256.401209] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance b954e159-4d89-4c61-a5bc-5e6c67cf278c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1256.401383] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance aeeb7c6c-7413-46b0-8632-c7224620e9b2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1256.401516] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 7b48729f-86a7-4b53-ad11-ef8a929ec947 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1256.401676] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 42999bc8-a3be-4310-97ad-324c7f4fc8d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1256.401831] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 5403acfb-04d2-4081-80c7-23d662410a20 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1256.402042] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Total usable vcpus: 48, total allocated vcpus: 5 {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1256.402195] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1152MB phys_disk=200GB used_disk=5GB total_vcpus=48 used_vcpus=5 pci_stats=[] {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1256.419179] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Refreshing inventories for resource provider b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1256.432766] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] 
Updating ProviderTree inventory for provider b35c9fce-988b-4acc-b175-83b202107c41 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1256.432970] env[61439]: DEBUG nova.compute.provider_tree [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Updating inventory in ProviderTree for provider b35c9fce-988b-4acc-b175-83b202107c41 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1256.444262] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Refreshing aggregate associations for resource provider b35c9fce-988b-4acc-b175-83b202107c41, aggregates: None {{(pid=61439) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1256.461667] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Refreshing trait associations for resource provider b35c9fce-988b-4acc-b175-83b202107c41, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NODE {{(pid=61439) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1256.525136] env[61439]: 
DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6547c19-b547-40b9-bfbf-e6b2210319cb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1256.532950] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95067d22-eb84-4166-b2b9-c5182be96244 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1256.563282] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-609865df-d59c-4700-b70d-86e5871dab08 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1256.570237] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-675d338a-7b9b-4a86-b857-147e2d7d41b1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1256.582864] env[61439]: DEBUG nova.compute.provider_tree [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1256.591150] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1256.603547] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=61439) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1256.603726] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.332s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1257.602930] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1257.603244] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1257.603371] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1258.201603] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1258.201781] env[61439]: DEBUG nova.compute.manager [None 
req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Starting heal instance info cache {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1258.201912] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Rebuilding the list of instances to heal {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1258.216034] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1258.216195] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1258.216330] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1258.216461] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1258.216590] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Skipping network cache update for instance because it is Building. 
{{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1258.216716] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Didn't find any instances for network info cache update. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1258.217197] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1258.217341] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=61439) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1259.202654] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1260.203034] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1264.260676] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "ba1bd6f7-6f05-4a09-a35c-6493b64feb9d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1264.261122] env[61439]: DEBUG oslo_concurrency.lockutils 
[None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "ba1bd6f7-6f05-4a09-a35c-6493b64feb9d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1264.271808] env[61439]: DEBUG nova.compute.manager [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1264.326392] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1264.326700] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1264.328369] env[61439]: INFO nova.compute.claims [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1264.496682] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-3cca8a6c-26bd-47a9-bbcc-700c6e789a08 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1264.506724] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a01806b5-64b5-4c51-b6f8-29bcfe6d5d65 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1264.539877] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20846011-b1a1-48c1-83f9-2c03ac562ecb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1264.547607] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-907515a4-99b0-497c-a34b-1eb908457023 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1264.561468] env[61439]: DEBUG nova.compute.provider_tree [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1264.571516] env[61439]: DEBUG nova.scheduler.client.report [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} 
{{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1264.586214] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.259s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1264.586751] env[61439]: DEBUG nova.compute.manager [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1264.625278] env[61439]: DEBUG nova.compute.utils [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1264.629861] env[61439]: DEBUG nova.compute.manager [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1264.629861] env[61439]: DEBUG nova.network.neutron [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1264.645142] env[61439]: DEBUG nova.compute.manager [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1264.734125] env[61439]: DEBUG nova.compute.manager [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1264.759110] env[61439]: DEBUG nova.virt.hardware [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1264.759362] env[61439]: DEBUG nova.virt.hardware [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1264.759530] env[61439]: DEBUG nova.virt.hardware [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1264.759718] env[61439]: DEBUG nova.virt.hardware [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:388}} [ 1264.759893] env[61439]: DEBUG nova.virt.hardware [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1264.764017] env[61439]: DEBUG nova.virt.hardware [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1264.764017] env[61439]: DEBUG nova.virt.hardware [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1264.764017] env[61439]: DEBUG nova.virt.hardware [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1264.764017] env[61439]: DEBUG nova.virt.hardware [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1264.764017] env[61439]: DEBUG nova.virt.hardware [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:575}} [ 1264.764017] env[61439]: DEBUG nova.virt.hardware [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1264.764017] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7a72471-37d0-4e03-92c6-cca6b45300fd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1264.771791] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b9e00d1-c51c-4b1c-b300-62b9dc716f50 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1264.843429] env[61439]: DEBUG nova.policy [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cf4545757716483485ca9b60bd689a1c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e839303682f748a4b5a42c8a9273e388', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 1265.313013] env[61439]: DEBUG nova.network.neutron [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Successfully created port: 69860ca7-3239-4872-9b6e-0c087819b941 {{(pid=61439) _create_port_minimal 
/opt/stack/nova/nova/network/neutron.py:548}} [ 1265.600391] env[61439]: DEBUG oslo_concurrency.lockutils [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Acquiring lock "5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1265.600603] env[61439]: DEBUG oslo_concurrency.lockutils [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Lock "5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1265.614139] env[61439]: DEBUG nova.compute.manager [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1265.669743] env[61439]: DEBUG oslo_concurrency.lockutils [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1265.670015] env[61439]: DEBUG oslo_concurrency.lockutils [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1265.672015] env[61439]: INFO nova.compute.claims [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1265.822506] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04b865e7-113f-4ff4-a238-7ae4a0341849 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1265.830770] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5091383-7a31-439c-94af-70031e9a1a33 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1265.863781] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe05e5aa-e560-4d3c-8ae3-07f27ecafa9f 
{{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1265.871448] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b79a8789-afea-459d-9120-b5f5888e9d33 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1265.886574] env[61439]: DEBUG nova.compute.provider_tree [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1265.900746] env[61439]: DEBUG nova.scheduler.client.report [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1265.914664] env[61439]: DEBUG oslo_concurrency.lockutils [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.245s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1265.915160] 
env[61439]: DEBUG nova.compute.manager [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1265.955547] env[61439]: DEBUG nova.compute.utils [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1265.956998] env[61439]: DEBUG nova.compute.manager [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1265.957354] env[61439]: DEBUG nova.network.neutron [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1265.969188] env[61439]: DEBUG nova.compute.manager [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Start building block device mappings for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1266.042628] env[61439]: DEBUG nova.compute.manager [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1266.084345] env[61439]: DEBUG nova.virt.hardware [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1266.084980] env[61439]: DEBUG nova.virt.hardware [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1266.085274] env[61439]: DEBUG nova.virt.hardware [None req-96438c0a-d283-4012-8477-59279c736c1f 
tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1266.085576] env[61439]: DEBUG nova.virt.hardware [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1266.085841] env[61439]: DEBUG nova.virt.hardware [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1266.086097] env[61439]: DEBUG nova.virt.hardware [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1266.086413] env[61439]: DEBUG nova.virt.hardware [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1266.086738] env[61439]: DEBUG nova.virt.hardware [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 1266.087054] env[61439]: DEBUG nova.virt.hardware [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1266.087331] env[61439]: DEBUG nova.virt.hardware [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1266.087619] env[61439]: DEBUG nova.virt.hardware [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1266.088654] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0063838f-2544-4b1c-a0cd-1401e8f0db33 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.098735] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ed1ac9d-4f7b-402c-8648-f0db3dae049e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.115593] env[61439]: DEBUG nova.policy [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '05fe4841eb644cc792f699e538d31b7f', 
'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '32ed96e3b417472da91886ea192c588b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 1266.159530] env[61439]: DEBUG oslo_concurrency.lockutils [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Acquiring lock "4abe5722-e83b-4c40-9b82-ca84545496c8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1266.160011] env[61439]: DEBUG oslo_concurrency.lockutils [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Lock "4abe5722-e83b-4c40-9b82-ca84545496c8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1266.176018] env[61439]: DEBUG nova.compute.manager [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1266.230737] env[61439]: DEBUG oslo_concurrency.lockutils [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1266.230737] env[61439]: DEBUG oslo_concurrency.lockutils [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1266.231234] env[61439]: INFO nova.compute.claims [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1266.295309] env[61439]: DEBUG nova.compute.manager [req-cfcff0ae-30b7-4eda-9f2d-ef44186c8f9f req-b30f41f2-3694-4c6b-89e0-5c2ec45f54aa service nova] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Received event network-vif-plugged-69860ca7-3239-4872-9b6e-0c087819b941 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1266.295309] env[61439]: DEBUG oslo_concurrency.lockutils [req-cfcff0ae-30b7-4eda-9f2d-ef44186c8f9f req-b30f41f2-3694-4c6b-89e0-5c2ec45f54aa service nova] Acquiring lock "ba1bd6f7-6f05-4a09-a35c-6493b64feb9d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=61439) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1266.295309] env[61439]: DEBUG oslo_concurrency.lockutils [req-cfcff0ae-30b7-4eda-9f2d-ef44186c8f9f req-b30f41f2-3694-4c6b-89e0-5c2ec45f54aa service nova] Lock "ba1bd6f7-6f05-4a09-a35c-6493b64feb9d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1266.295309] env[61439]: DEBUG oslo_concurrency.lockutils [req-cfcff0ae-30b7-4eda-9f2d-ef44186c8f9f req-b30f41f2-3694-4c6b-89e0-5c2ec45f54aa service nova] Lock "ba1bd6f7-6f05-4a09-a35c-6493b64feb9d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1266.295309] env[61439]: DEBUG nova.compute.manager [req-cfcff0ae-30b7-4eda-9f2d-ef44186c8f9f req-b30f41f2-3694-4c6b-89e0-5c2ec45f54aa service nova] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] No waiting events found dispatching network-vif-plugged-69860ca7-3239-4872-9b6e-0c087819b941 {{(pid=61439) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1266.295309] env[61439]: WARNING nova.compute.manager [req-cfcff0ae-30b7-4eda-9f2d-ef44186c8f9f req-b30f41f2-3694-4c6b-89e0-5c2ec45f54aa service nova] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Received unexpected event network-vif-plugged-69860ca7-3239-4872-9b6e-0c087819b941 for instance with vm_state building and task_state spawning. 
[ 1266.407019] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5182a7ef-80ac-4fd3-beda-daa55f4e72c0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.414188] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8fcec4d-66fa-4cba-a369-68e7191a1fd1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.445068] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-810e8237-13aa-45d8-8791-4a282597977d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.453361] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61f1c785-a0cc-4208-a597-b759e8b153ee {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.466470] env[61439]: DEBUG nova.compute.provider_tree [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1266.475410] env[61439]: DEBUG nova.scheduler.client.report [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 
65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1266.490707] env[61439]: DEBUG oslo_concurrency.lockutils [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.261s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1266.491218] env[61439]: DEBUG nova.compute.manager [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1266.526013] env[61439]: DEBUG nova.compute.utils [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1266.527942] env[61439]: DEBUG nova.compute.manager [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Allocating IP information in the background. 
{{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1266.527942] env[61439]: DEBUG nova.network.neutron [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1266.537908] env[61439]: DEBUG nova.compute.manager [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1266.594165] env[61439]: DEBUG nova.policy [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '05fe4841eb644cc792f699e538d31b7f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '32ed96e3b417472da91886ea192c588b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 1266.607874] env[61439]: DEBUG nova.compute.manager [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1266.633127] env[61439]: DEBUG nova.virt.hardware [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1266.633127] env[61439]: DEBUG nova.virt.hardware [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1266.633127] env[61439]: DEBUG nova.virt.hardware [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1266.633127] env[61439]: DEBUG nova.virt.hardware [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 
tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1266.633127] env[61439]: DEBUG nova.virt.hardware [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1266.633127] env[61439]: DEBUG nova.virt.hardware [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1266.633127] env[61439]: DEBUG nova.virt.hardware [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1266.633127] env[61439]: DEBUG nova.virt.hardware [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1266.633127] env[61439]: DEBUG nova.virt.hardware [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1266.633741] env[61439]: DEBUG 
nova.virt.hardware [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1266.634098] env[61439]: DEBUG nova.virt.hardware [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1266.635831] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09bf330b-ee90-4552-bb82-2692b0f752fc {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.645458] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b75e2ee5-ace9-4ab5-85c7-f203be0e94c4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.690853] env[61439]: DEBUG nova.network.neutron [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Successfully updated port: 69860ca7-3239-4872-9b6e-0c087819b941 {{(pid=61439) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1266.704597] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "refresh_cache-ba1bd6f7-6f05-4a09-a35c-6493b64feb9d" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 
1266.704597] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquired lock "refresh_cache-ba1bd6f7-6f05-4a09-a35c-6493b64feb9d" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1266.704597] env[61439]: DEBUG nova.network.neutron [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1266.790405] env[61439]: DEBUG nova.network.neutron [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1267.003432] env[61439]: DEBUG nova.network.neutron [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Successfully created port: 13892b1e-9139-4711-927c-10e97b557204 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1267.114824] env[61439]: DEBUG nova.network.neutron [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Updating instance_info_cache with network_info: [{"id": "69860ca7-3239-4872-9b6e-0c087819b941", "address": "fa:16:3e:ff:1b:9f", "network": {"id": "ae0ba33d-286b-46d2-b7e5-caea99f81aea", "bridge": "br-int", "label": "tempest-ServersTestJSON-1142892958-network", "subnets": [{"cidr": 
"192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e839303682f748a4b5a42c8a9273e388", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "779b8e65-8b9e-427e-af08-910febd65bfa", "external-id": "nsx-vlan-transportzone-906", "segmentation_id": 906, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap69860ca7-32", "ovs_interfaceid": "69860ca7-3239-4872-9b6e-0c087819b941", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1267.129030] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Releasing lock "refresh_cache-ba1bd6f7-6f05-4a09-a35c-6493b64feb9d" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1267.129030] env[61439]: DEBUG nova.compute.manager [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Instance network_info: |[{"id": "69860ca7-3239-4872-9b6e-0c087819b941", "address": "fa:16:3e:ff:1b:9f", "network": {"id": "ae0ba33d-286b-46d2-b7e5-caea99f81aea", "bridge": "br-int", "label": "tempest-ServersTestJSON-1142892958-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e839303682f748a4b5a42c8a9273e388", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "779b8e65-8b9e-427e-af08-910febd65bfa", "external-id": "nsx-vlan-transportzone-906", "segmentation_id": 906, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap69860ca7-32", "ovs_interfaceid": "69860ca7-3239-4872-9b6e-0c087819b941", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1267.129030] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ff:1b:9f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '779b8e65-8b9e-427e-af08-910febd65bfa', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '69860ca7-3239-4872-9b6e-0c087819b941', 'vif_model': 'vmxnet3'}] {{(pid=61439) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1267.136167] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Creating folder: Project (e839303682f748a4b5a42c8a9273e388). Parent ref: group-v221281. 
{{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1267.136866] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-332b8f32-e755-4795-a0e1-abef1198278f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.149677] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Created folder: Project (e839303682f748a4b5a42c8a9273e388) in parent group-v221281. [ 1267.150181] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Creating folder: Instances. Parent ref: group-v221335. {{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1267.150646] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fb96aa50-f380-445c-a5f6-aa994cc0f2ae {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.160436] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Created folder: Instances in parent group-v221335. [ 1267.160827] env[61439]: DEBUG oslo.service.loopingcall [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1267.161145] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Creating VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1267.161478] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0c9923c3-3455-4ad4-a855-9f8c9dd5d809 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.180723] env[61439]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1267.180723] env[61439]: value = "task-987744" [ 1267.180723] env[61439]: _type = "Task" [ 1267.180723] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1267.188085] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987744, 'name': CreateVM_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1267.606313] env[61439]: DEBUG nova.network.neutron [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Successfully created port: 7c94638b-a68c-4955-86a7-efdcca8ed57c {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1267.691711] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987744, 'name': CreateVM_Task, 'duration_secs': 0.299463} completed successfully. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1267.692703] env[61439]: DEBUG nova.network.neutron [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Successfully updated port: 13892b1e-9139-4711-927c-10e97b557204 {{(pid=61439) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1267.693775] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Created VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1267.694474] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1267.694651] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1267.695198] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1267.695342] env[61439]: DEBUG oslo_vmware.service [-] Invoking 
HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ac93c960-394f-4d2d-8b1b-08ab2f2edaf3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1267.703382] env[61439]: DEBUG oslo_vmware.api [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for the task: (returnval){
[ 1267.703382] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]5204127a-fc0f-6204-15f2-ce941002ad63"
[ 1267.703382] env[61439]: _type = "Task"
[ 1267.703382] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1267.707744] env[61439]: DEBUG oslo_concurrency.lockutils [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Acquiring lock "refresh_cache-5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1267.707882] env[61439]: DEBUG oslo_concurrency.lockutils [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Acquired lock "refresh_cache-5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1267.708039] env[61439]: DEBUG nova.network.neutron [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1267.716754] env[61439]: DEBUG oslo_vmware.api [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]5204127a-fc0f-6204-15f2-ce941002ad63, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1267.752693] env[61439]: DEBUG nova.network.neutron [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1267.962501] env[61439]: DEBUG nova.network.neutron [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Updating instance_info_cache with network_info: [{"id": "13892b1e-9139-4711-927c-10e97b557204", "address": "fa:16:3e:19:f6:ec", "network": {"id": "d3f8bd58-1789-442b-917b-96529b971366", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1504553637-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "32ed96e3b417472da91886ea192c588b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69744f59-ecac-4b0b-831e-82a274d7acbb", "external-id": "nsx-vlan-transportzone-770", "segmentation_id": 770, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap13892b1e-91", "ovs_interfaceid": "13892b1e-9139-4711-927c-10e97b557204", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1267.976265] env[61439]: DEBUG oslo_concurrency.lockutils [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Releasing lock "refresh_cache-5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1267.976578] env[61439]: DEBUG nova.compute.manager [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Instance network_info: |[{"id": "13892b1e-9139-4711-927c-10e97b557204", "address": "fa:16:3e:19:f6:ec", "network": {"id": "d3f8bd58-1789-442b-917b-96529b971366", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1504553637-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "32ed96e3b417472da91886ea192c588b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69744f59-ecac-4b0b-831e-82a274d7acbb", "external-id": "nsx-vlan-transportzone-770", "segmentation_id": 770, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap13892b1e-91", "ovs_interfaceid": "13892b1e-9139-4711-927c-10e97b557204", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 1267.976958] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:19:f6:ec', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '69744f59-ecac-4b0b-831e-82a274d7acbb', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '13892b1e-9139-4711-927c-10e97b557204', 'vif_model': 'vmxnet3'}] {{(pid=61439) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1267.984442] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Creating folder: Project (32ed96e3b417472da91886ea192c588b). Parent ref: group-v221281. {{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1267.984976] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-26b5efb1-bde5-4f94-baea-27b8a9c13f46 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1267.996531] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Created folder: Project (32ed96e3b417472da91886ea192c588b) in parent group-v221281.
[ 1267.996720] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Creating folder: Instances. Parent ref: group-v221338. {{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1267.996943] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-46e4ca7d-f8f5-4f85-b97f-a29e290a44b6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1268.005323] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Created folder: Instances in parent group-v221338.
[ 1268.005542] env[61439]: DEBUG oslo.service.loopingcall [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1268.005721] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Creating VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1268.005906] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-170daadf-a68f-482c-ab57-ada03a785233 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1268.024877] env[61439]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1268.024877] env[61439]: value = "task-987747"
[ 1268.024877] env[61439]: _type = "Task"
[ 1268.024877] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1268.032208] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987747, 'name': CreateVM_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1268.121448] env[61439]: DEBUG nova.compute.manager [req-95fee27a-d0ae-430f-8b52-5ff64008b8b5 req-23ad1e1b-9028-4f2d-a6f6-0cc844797eaa service nova] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Received event network-vif-plugged-7c94638b-a68c-4955-86a7-efdcca8ed57c {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1268.121681] env[61439]: DEBUG oslo_concurrency.lockutils [req-95fee27a-d0ae-430f-8b52-5ff64008b8b5 req-23ad1e1b-9028-4f2d-a6f6-0cc844797eaa service nova] Acquiring lock "4abe5722-e83b-4c40-9b82-ca84545496c8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1268.121889] env[61439]: DEBUG oslo_concurrency.lockutils [req-95fee27a-d0ae-430f-8b52-5ff64008b8b5 req-23ad1e1b-9028-4f2d-a6f6-0cc844797eaa service nova] Lock "4abe5722-e83b-4c40-9b82-ca84545496c8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1268.122101] env[61439]: DEBUG oslo_concurrency.lockutils [req-95fee27a-d0ae-430f-8b52-5ff64008b8b5 req-23ad1e1b-9028-4f2d-a6f6-0cc844797eaa service nova] Lock "4abe5722-e83b-4c40-9b82-ca84545496c8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1268.122286] env[61439]: DEBUG nova.compute.manager [req-95fee27a-d0ae-430f-8b52-5ff64008b8b5 req-23ad1e1b-9028-4f2d-a6f6-0cc844797eaa service nova] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] No waiting events found dispatching network-vif-plugged-7c94638b-a68c-4955-86a7-efdcca8ed57c {{(pid=61439) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1268.122475] env[61439]: WARNING nova.compute.manager [req-95fee27a-d0ae-430f-8b52-5ff64008b8b5 req-23ad1e1b-9028-4f2d-a6f6-0cc844797eaa service nova] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Received unexpected event network-vif-plugged-7c94638b-a68c-4955-86a7-efdcca8ed57c for instance with vm_state building and task_state spawning.
[ 1268.166467] env[61439]: DEBUG nova.network.neutron [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Successfully updated port: 7c94638b-a68c-4955-86a7-efdcca8ed57c {{(pid=61439) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1268.177470] env[61439]: DEBUG oslo_concurrency.lockutils [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Acquiring lock "refresh_cache-4abe5722-e83b-4c40-9b82-ca84545496c8" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1268.177625] env[61439]: DEBUG oslo_concurrency.lockutils [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Acquired lock "refresh_cache-4abe5722-e83b-4c40-9b82-ca84545496c8" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1268.177778] env[61439]: DEBUG nova.network.neutron [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1268.214788] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1268.215120] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Processing image a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1268.215357] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1268.219625] env[61439]: DEBUG nova.network.neutron [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1268.322665] env[61439]: DEBUG nova.compute.manager [req-02785594-7dda-4ed6-a8ba-3c52eaa01a2c req-79dca767-8a1f-4f49-aade-2ea2279160ef service nova] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Received event network-changed-69860ca7-3239-4872-9b6e-0c087819b941 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1268.322979] env[61439]: DEBUG nova.compute.manager [req-02785594-7dda-4ed6-a8ba-3c52eaa01a2c req-79dca767-8a1f-4f49-aade-2ea2279160ef service nova] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Refreshing instance network info cache due to event network-changed-69860ca7-3239-4872-9b6e-0c087819b941. {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 1268.323108] env[61439]: DEBUG oslo_concurrency.lockutils [req-02785594-7dda-4ed6-a8ba-3c52eaa01a2c req-79dca767-8a1f-4f49-aade-2ea2279160ef service nova] Acquiring lock "refresh_cache-ba1bd6f7-6f05-4a09-a35c-6493b64feb9d" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1268.323257] env[61439]: DEBUG oslo_concurrency.lockutils [req-02785594-7dda-4ed6-a8ba-3c52eaa01a2c req-79dca767-8a1f-4f49-aade-2ea2279160ef service nova] Acquired lock "refresh_cache-ba1bd6f7-6f05-4a09-a35c-6493b64feb9d" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1268.323851] env[61439]: DEBUG nova.network.neutron [req-02785594-7dda-4ed6-a8ba-3c52eaa01a2c req-79dca767-8a1f-4f49-aade-2ea2279160ef service nova] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Refreshing network info cache for port 69860ca7-3239-4872-9b6e-0c087819b941 {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1268.379677] env[61439]: DEBUG nova.network.neutron [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Updating instance_info_cache with network_info: [{"id": "7c94638b-a68c-4955-86a7-efdcca8ed57c", "address": "fa:16:3e:68:b4:7e", "network": {"id": "d3f8bd58-1789-442b-917b-96529b971366", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1504553637-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "32ed96e3b417472da91886ea192c588b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69744f59-ecac-4b0b-831e-82a274d7acbb", "external-id": "nsx-vlan-transportzone-770", "segmentation_id": 770, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7c94638b-a6", "ovs_interfaceid": "7c94638b-a68c-4955-86a7-efdcca8ed57c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1268.393056] env[61439]: DEBUG oslo_concurrency.lockutils [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Releasing lock "refresh_cache-4abe5722-e83b-4c40-9b82-ca84545496c8" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1268.393288] env[61439]: DEBUG nova.compute.manager [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Instance network_info: |[{"id": "7c94638b-a68c-4955-86a7-efdcca8ed57c", "address": "fa:16:3e:68:b4:7e", "network": {"id": "d3f8bd58-1789-442b-917b-96529b971366", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1504553637-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "32ed96e3b417472da91886ea192c588b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69744f59-ecac-4b0b-831e-82a274d7acbb", "external-id": "nsx-vlan-transportzone-770", "segmentation_id": 770, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7c94638b-a6", "ovs_interfaceid": "7c94638b-a68c-4955-86a7-efdcca8ed57c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 1268.394028] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:68:b4:7e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '69744f59-ecac-4b0b-831e-82a274d7acbb', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7c94638b-a68c-4955-86a7-efdcca8ed57c', 'vif_model': 'vmxnet3'}] {{(pid=61439) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1268.401204] env[61439]: DEBUG oslo.service.loopingcall [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1268.401647] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Creating VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1268.401889] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e666e8c1-443c-4840-bc15-6b9b91bc2b37 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1268.423888] env[61439]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1268.423888] env[61439]: value = "task-987748"
[ 1268.423888] env[61439]: _type = "Task"
[ 1268.423888] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1268.432159] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987748, 'name': CreateVM_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1268.534421] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987747, 'name': CreateVM_Task, 'duration_secs': 0.285917} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1268.534578] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Created VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1268.535270] env[61439]: DEBUG oslo_concurrency.lockutils [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1268.535436] env[61439]: DEBUG oslo_concurrency.lockutils [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1268.535745] env[61439]: DEBUG oslo_concurrency.lockutils [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1268.535983] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c7356a0b-510d-4665-95c7-0b56493d50de {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1268.540935] env[61439]: DEBUG oslo_vmware.api [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Waiting for the task: (returnval){
[ 1268.540935] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]524d17b6-5d7b-ac01-2651-c5152664e029"
[ 1268.540935] env[61439]: _type = "Task"
[ 1268.540935] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1268.548944] env[61439]: DEBUG oslo_vmware.api [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]524d17b6-5d7b-ac01-2651-c5152664e029, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1268.659432] env[61439]: DEBUG nova.network.neutron [req-02785594-7dda-4ed6-a8ba-3c52eaa01a2c req-79dca767-8a1f-4f49-aade-2ea2279160ef service nova] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Updated VIF entry in instance network info cache for port 69860ca7-3239-4872-9b6e-0c087819b941. {{(pid=61439) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1268.659802] env[61439]: DEBUG nova.network.neutron [req-02785594-7dda-4ed6-a8ba-3c52eaa01a2c req-79dca767-8a1f-4f49-aade-2ea2279160ef service nova] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Updating instance_info_cache with network_info: [{"id": "69860ca7-3239-4872-9b6e-0c087819b941", "address": "fa:16:3e:ff:1b:9f", "network": {"id": "ae0ba33d-286b-46d2-b7e5-caea99f81aea", "bridge": "br-int", "label": "tempest-ServersTestJSON-1142892958-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e839303682f748a4b5a42c8a9273e388", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "779b8e65-8b9e-427e-af08-910febd65bfa", "external-id": "nsx-vlan-transportzone-906", "segmentation_id": 906, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap69860ca7-32", "ovs_interfaceid": "69860ca7-3239-4872-9b6e-0c087819b941", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1268.669354] env[61439]: DEBUG oslo_concurrency.lockutils [req-02785594-7dda-4ed6-a8ba-3c52eaa01a2c req-79dca767-8a1f-4f49-aade-2ea2279160ef service nova] Releasing lock "refresh_cache-ba1bd6f7-6f05-4a09-a35c-6493b64feb9d" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1268.669606] env[61439]: DEBUG nova.compute.manager [req-02785594-7dda-4ed6-a8ba-3c52eaa01a2c req-79dca767-8a1f-4f49-aade-2ea2279160ef service nova] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Received event network-vif-plugged-13892b1e-9139-4711-927c-10e97b557204 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1268.669819] env[61439]: DEBUG oslo_concurrency.lockutils [req-02785594-7dda-4ed6-a8ba-3c52eaa01a2c req-79dca767-8a1f-4f49-aade-2ea2279160ef service nova] Acquiring lock "5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1268.670048] env[61439]: DEBUG oslo_concurrency.lockutils [req-02785594-7dda-4ed6-a8ba-3c52eaa01a2c req-79dca767-8a1f-4f49-aade-2ea2279160ef service nova] Lock "5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1268.670233] env[61439]: DEBUG oslo_concurrency.lockutils [req-02785594-7dda-4ed6-a8ba-3c52eaa01a2c req-79dca767-8a1f-4f49-aade-2ea2279160ef service nova] Lock "5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1268.670483] env[61439]: DEBUG nova.compute.manager [req-02785594-7dda-4ed6-a8ba-3c52eaa01a2c req-79dca767-8a1f-4f49-aade-2ea2279160ef service nova] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] No waiting events found dispatching network-vif-plugged-13892b1e-9139-4711-927c-10e97b557204 {{(pid=61439) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1268.670674] env[61439]: WARNING nova.compute.manager [req-02785594-7dda-4ed6-a8ba-3c52eaa01a2c req-79dca767-8a1f-4f49-aade-2ea2279160ef service nova] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Received unexpected event network-vif-plugged-13892b1e-9139-4711-927c-10e97b557204 for instance with vm_state building and task_state spawning.
[ 1268.670852] env[61439]: DEBUG nova.compute.manager [req-02785594-7dda-4ed6-a8ba-3c52eaa01a2c req-79dca767-8a1f-4f49-aade-2ea2279160ef service nova] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Received event network-changed-13892b1e-9139-4711-927c-10e97b557204 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1268.671038] env[61439]: DEBUG nova.compute.manager [req-02785594-7dda-4ed6-a8ba-3c52eaa01a2c req-79dca767-8a1f-4f49-aade-2ea2279160ef service nova] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Refreshing instance network info cache due to event network-changed-13892b1e-9139-4711-927c-10e97b557204. {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 1268.671253] env[61439]: DEBUG oslo_concurrency.lockutils [req-02785594-7dda-4ed6-a8ba-3c52eaa01a2c req-79dca767-8a1f-4f49-aade-2ea2279160ef service nova] Acquiring lock "refresh_cache-5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1268.671399] env[61439]: DEBUG oslo_concurrency.lockutils [req-02785594-7dda-4ed6-a8ba-3c52eaa01a2c req-79dca767-8a1f-4f49-aade-2ea2279160ef service nova] Acquired lock "refresh_cache-5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1268.671599] env[61439]: DEBUG nova.network.neutron [req-02785594-7dda-4ed6-a8ba-3c52eaa01a2c req-79dca767-8a1f-4f49-aade-2ea2279160ef service nova] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Refreshing network info cache for port 13892b1e-9139-4711-927c-10e97b557204 {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1268.905822] env[61439]: DEBUG nova.network.neutron [req-02785594-7dda-4ed6-a8ba-3c52eaa01a2c req-79dca767-8a1f-4f49-aade-2ea2279160ef service nova] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Updated VIF entry in instance network info cache for port 13892b1e-9139-4711-927c-10e97b557204. {{(pid=61439) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1268.906344] env[61439]: DEBUG nova.network.neutron [req-02785594-7dda-4ed6-a8ba-3c52eaa01a2c req-79dca767-8a1f-4f49-aade-2ea2279160ef service nova] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Updating instance_info_cache with network_info: [{"id": "13892b1e-9139-4711-927c-10e97b557204", "address": "fa:16:3e:19:f6:ec", "network": {"id": "d3f8bd58-1789-442b-917b-96529b971366", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1504553637-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "32ed96e3b417472da91886ea192c588b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69744f59-ecac-4b0b-831e-82a274d7acbb", "external-id": "nsx-vlan-transportzone-770", "segmentation_id": 770, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap13892b1e-91", "ovs_interfaceid": "13892b1e-9139-4711-927c-10e97b557204", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1268.915783] env[61439]: DEBUG oslo_concurrency.lockutils [req-02785594-7dda-4ed6-a8ba-3c52eaa01a2c req-79dca767-8a1f-4f49-aade-2ea2279160ef service nova] Releasing lock "refresh_cache-5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1268.936894] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987748, 'name': CreateVM_Task, 'duration_secs': 0.298036} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1268.937119] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Created VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1268.937909] env[61439]: DEBUG oslo_concurrency.lockutils [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1269.051670] env[61439]: DEBUG oslo_concurrency.lockutils [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1269.051924] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Processing image a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1269.052147] env[61439]: DEBUG oslo_concurrency.lockutils [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1269.052361] env[61439]: DEBUG oslo_concurrency.lockutils [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1269.052694] env[61439]: DEBUG oslo_concurrency.lockutils [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1269.052945] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5dcd1ed2-0c51-4252-aee1-1706f4ee849c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1269.057217] env[61439]: DEBUG oslo_vmware.api [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Waiting for the task: (returnval){
[ 1269.057217] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52ccc319-9c2b-a112-4cb9-9c7e1bad8a3f"
[ 1269.057217] env[61439]: _type = "Task"
[ 1269.057217] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1269.064557] env[61439]: DEBUG oslo_vmware.api [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52ccc319-9c2b-a112-4cb9-9c7e1bad8a3f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1269.567284] env[61439]: DEBUG oslo_concurrency.lockutils [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1269.567535] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Processing image a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1269.567747] env[61439]: DEBUG oslo_concurrency.lockutils [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1270.146886] env[61439]: DEBUG nova.compute.manager
[req-77064a4a-2087-4cb2-8d3c-cdaae62d00cf req-d4a7727c-64bb-4d8b-90e6-97cfa413ec05 service nova] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Received event network-changed-7c94638b-a68c-4955-86a7-efdcca8ed57c {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1270.147132] env[61439]: DEBUG nova.compute.manager [req-77064a4a-2087-4cb2-8d3c-cdaae62d00cf req-d4a7727c-64bb-4d8b-90e6-97cfa413ec05 service nova] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Refreshing instance network info cache due to event network-changed-7c94638b-a68c-4955-86a7-efdcca8ed57c. {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1270.147301] env[61439]: DEBUG oslo_concurrency.lockutils [req-77064a4a-2087-4cb2-8d3c-cdaae62d00cf req-d4a7727c-64bb-4d8b-90e6-97cfa413ec05 service nova] Acquiring lock "refresh_cache-4abe5722-e83b-4c40-9b82-ca84545496c8" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1270.147448] env[61439]: DEBUG oslo_concurrency.lockutils [req-77064a4a-2087-4cb2-8d3c-cdaae62d00cf req-d4a7727c-64bb-4d8b-90e6-97cfa413ec05 service nova] Acquired lock "refresh_cache-4abe5722-e83b-4c40-9b82-ca84545496c8" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1270.147608] env[61439]: DEBUG nova.network.neutron [req-77064a4a-2087-4cb2-8d3c-cdaae62d00cf req-d4a7727c-64bb-4d8b-90e6-97cfa413ec05 service nova] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Refreshing network info cache for port 7c94638b-a68c-4955-86a7-efdcca8ed57c {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1270.482642] env[61439]: DEBUG nova.network.neutron [req-77064a4a-2087-4cb2-8d3c-cdaae62d00cf req-d4a7727c-64bb-4d8b-90e6-97cfa413ec05 service nova] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Updated VIF entry in instance network info cache for port 
7c94638b-a68c-4955-86a7-efdcca8ed57c. {{(pid=61439) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1270.483028] env[61439]: DEBUG nova.network.neutron [req-77064a4a-2087-4cb2-8d3c-cdaae62d00cf req-d4a7727c-64bb-4d8b-90e6-97cfa413ec05 service nova] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Updating instance_info_cache with network_info: [{"id": "7c94638b-a68c-4955-86a7-efdcca8ed57c", "address": "fa:16:3e:68:b4:7e", "network": {"id": "d3f8bd58-1789-442b-917b-96529b971366", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1504553637-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "32ed96e3b417472da91886ea192c588b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69744f59-ecac-4b0b-831e-82a274d7acbb", "external-id": "nsx-vlan-transportzone-770", "segmentation_id": 770, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7c94638b-a6", "ovs_interfaceid": "7c94638b-a68c-4955-86a7-efdcca8ed57c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1270.491645] env[61439]: DEBUG oslo_concurrency.lockutils [req-77064a4a-2087-4cb2-8d3c-cdaae62d00cf req-d4a7727c-64bb-4d8b-90e6-97cfa413ec05 service nova] Releasing lock "refresh_cache-4abe5722-e83b-4c40-9b82-ca84545496c8" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 
1277.449888] env[61439]: WARNING oslo_vmware.rw_handles [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1277.449888] env[61439]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1277.449888] env[61439]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1277.449888] env[61439]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1277.449888] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1277.449888] env[61439]: ERROR oslo_vmware.rw_handles response.begin() [ 1277.449888] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1277.449888] env[61439]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1277.449888] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1277.449888] env[61439]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1277.449888] env[61439]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1277.449888] env[61439]: ERROR oslo_vmware.rw_handles [ 1277.450599] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Downloaded image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to vmware_temp/fd2eb784-8252-4a5b-9533-f976bd7ccf0a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) fetch_image 
/opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1277.452328] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Caching image {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1277.452620] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Copying Virtual Disk [datastore2] vmware_temp/fd2eb784-8252-4a5b-9533-f976bd7ccf0a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk to [datastore2] vmware_temp/fd2eb784-8252-4a5b-9533-f976bd7ccf0a/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk {{(pid=61439) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1277.452913] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4f5a33b8-b627-4fe5-a486-21e9f68b4f77 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1277.460997] env[61439]: DEBUG oslo_vmware.api [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Waiting for the task: (returnval){ [ 1277.460997] env[61439]: value = "task-987749" [ 1277.460997] env[61439]: _type = "Task" [ 1277.460997] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1277.468652] env[61439]: DEBUG oslo_vmware.api [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Task: {'id': task-987749, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1277.970991] env[61439]: DEBUG oslo_vmware.exceptions [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Fault InvalidArgument not matched. {{(pid=61439) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1277.971303] env[61439]: DEBUG oslo_concurrency.lockutils [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1277.971893] env[61439]: ERROR nova.compute.manager [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1277.971893] env[61439]: Faults: ['InvalidArgument'] [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Traceback (most recent call last): [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] yield resources [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: 
b954e159-4d89-4c61-a5bc-5e6c67cf278c] self.driver.spawn(context, instance, image_meta, [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] self._fetch_image_if_missing(context, vi) [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] image_cache(vi, tmp_image_ds_loc) [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] vm_util.copy_virtual_disk( [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] session._wait_for_task(vmdk_copy_task) [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: 
b954e159-4d89-4c61-a5bc-5e6c67cf278c] return self.wait_for_task(task_ref) [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] return evt.wait() [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] result = hub.switch() [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] return self.greenlet.switch() [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] self.f(*self.args, **self.kw) [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] raise exceptions.translate_fault(task_info.error) [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1277.971893] env[61439]: ERROR 
nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Faults: ['InvalidArgument'] [ 1277.971893] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] [ 1277.973316] env[61439]: INFO nova.compute.manager [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Terminating instance [ 1277.973732] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1277.973938] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1277.974189] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c553bf5c-d60a-4d28-bedd-37811a73287b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1277.977048] env[61439]: DEBUG nova.compute.manager [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1277.977211] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1277.977960] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17ea7548-26a1-4557-b44c-1e5c85417d66 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1277.984995] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Unregistering the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1277.985251] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-a0cfc385-650a-48fa-9c0c-14df111e56fd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1277.987493] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1277.987660] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=61439) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1277.988639] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-99cfe1d1-d348-4ee1-97e8-72032c8ba9c3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1277.993219] env[61439]: DEBUG oslo_vmware.api [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Waiting for the task: (returnval){ [ 1277.993219] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52f00520-51a4-fd33-7789-dada785f9e08" [ 1277.993219] env[61439]: _type = "Task" [ 1277.993219] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1278.000188] env[61439]: DEBUG oslo_vmware.api [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52f00520-51a4-fd33-7789-dada785f9e08, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1278.086084] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Unregistered the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1278.086320] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Deleting contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1278.086488] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Deleting the datastore file [datastore2] b954e159-4d89-4c61-a5bc-5e6c67cf278c {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1278.086749] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1eb78ca9-fa86-413f-a60d-a0e64eb2f2e4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1278.092786] env[61439]: DEBUG oslo_vmware.api [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Waiting for the task: (returnval){ [ 1278.092786] env[61439]: value = "task-987751" [ 1278.092786] env[61439]: _type = "Task" [ 1278.092786] env[61439]: } to complete. 
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1278.100274] env[61439]: DEBUG oslo_vmware.api [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Task: {'id': task-987751, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1278.503686] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Preparing fetch location {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1278.504097] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Creating directory with path [datastore2] vmware_temp/fb51c790-8044-4e58-9ab1-60d2239abdc3/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1278.504166] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6cf27d43-c43f-4989-be66-8740c05180f2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1278.514944] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Created directory with path [datastore2] vmware_temp/fb51c790-8044-4e58-9ab1-60d2239abdc3/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1278.515155] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None 
req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Fetch image to [datastore2] vmware_temp/fb51c790-8044-4e58-9ab1-60d2239abdc3/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1278.515329] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to [datastore2] vmware_temp/fb51c790-8044-4e58-9ab1-60d2239abdc3/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1278.516008] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f70a2f17-9cc8-473e-b0f3-4656ccc6a253 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1278.522252] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30cdcc0d-1ddc-4707-9361-1d0584cec696 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1278.530974] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95dd3bc2-57ef-46a0-ba22-bdb3f097fd12 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1278.561483] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac84d105-0567-42e6-a934-5a6be884f8aa {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1278.566795] env[61439]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-852fa435-cffe-4558-b959-8ff40357d46d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1278.590673] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1278.601365] env[61439]: DEBUG oslo_vmware.api [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Task: {'id': task-987751, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076436} completed successfully. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1278.601528] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Deleted the datastore file {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1278.602066] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Deleted contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1278.602066] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1278.602168] env[61439]: INFO nova.compute.manager [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Took 0.62 seconds to destroy the instance on the hypervisor. 
[ 1278.604952] env[61439]: DEBUG nova.compute.claims [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1278.605178] env[61439]: DEBUG oslo_concurrency.lockutils [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1278.605400] env[61439]: DEBUG oslo_concurrency.lockutils [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1278.640179] env[61439]: DEBUG oslo_vmware.rw_handles [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fb51c790-8044-4e58-9ab1-60d2239abdc3/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=61439) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1278.702913] env[61439]: DEBUG oslo_vmware.rw_handles [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Completed reading data from the image iterator. {{(pid=61439) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1278.702913] env[61439]: DEBUG oslo_vmware.rw_handles [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fb51c790-8044-4e58-9ab1-60d2239abdc3/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1278.804581] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12f7abde-42f9-44a8-a7a7-4023b698dada {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1278.811771] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7c83f36-dc89-48d3-8b8e-ff5ff1a45c61 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1278.841521] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-989b2eaf-c525-4228-825a-8e7b1e5b58c2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1278.848548] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef63cc7e-e321-43ed-9900-b5563a88d7c6 
{{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1278.861240] env[61439]: DEBUG nova.compute.provider_tree [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1278.869207] env[61439]: DEBUG nova.scheduler.client.report [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1278.884655] env[61439]: DEBUG oslo_concurrency.lockutils [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.279s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1278.885223] env[61439]: ERROR nova.compute.manager [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not 
correct: fileType [ 1278.885223] env[61439]: Faults: ['InvalidArgument'] [ 1278.885223] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Traceback (most recent call last): [ 1278.885223] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1278.885223] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] self.driver.spawn(context, instance, image_meta, [ 1278.885223] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1278.885223] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1278.885223] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1278.885223] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] self._fetch_image_if_missing(context, vi) [ 1278.885223] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1278.885223] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] image_cache(vi, tmp_image_ds_loc) [ 1278.885223] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1278.885223] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] vm_util.copy_virtual_disk( [ 1278.885223] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1278.885223] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] session._wait_for_task(vmdk_copy_task) [ 1278.885223] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1278.885223] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] return self.wait_for_task(task_ref) [ 1278.885223] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1278.885223] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] return evt.wait() [ 1278.885223] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1278.885223] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] result = hub.switch() [ 1278.885223] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1278.885223] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] return self.greenlet.switch() [ 1278.885223] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1278.885223] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] self.f(*self.args, **self.kw) [ 1278.885223] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1278.885223] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] raise exceptions.translate_fault(task_info.error) [ 1278.885223] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1278.885223] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Faults: ['InvalidArgument'] [ 1278.885223] env[61439]: ERROR nova.compute.manager [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] [ 1278.886079] env[61439]: DEBUG nova.compute.utils [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] VimFaultException {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1278.887458] env[61439]: DEBUG nova.compute.manager [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Build of instance b954e159-4d89-4c61-a5bc-5e6c67cf278c was re-scheduled: A specified parameter was not correct: fileType [ 1278.887458] env[61439]: Faults: ['InvalidArgument'] {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1278.887820] env[61439]: DEBUG nova.compute.manager [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1278.887991] env[61439]: DEBUG nova.compute.manager [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b 
tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1278.888172] env[61439]: DEBUG nova.compute.manager [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1278.888338] env[61439]: DEBUG nova.network.neutron [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1279.538751] env[61439]: DEBUG nova.network.neutron [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1279.552138] env[61439]: INFO nova.compute.manager [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Took 0.66 seconds to deallocate network for instance. 
[ 1279.638114] env[61439]: INFO nova.scheduler.client.report [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Deleted allocations for instance b954e159-4d89-4c61-a5bc-5e6c67cf278c [ 1279.657363] env[61439]: DEBUG oslo_concurrency.lockutils [None req-cf626a5a-7b41-4975-8a7e-32f61e1bb58b tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Lock "b954e159-4d89-4c61-a5bc-5e6c67cf278c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 414.401s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1279.657641] env[61439]: DEBUG oslo_concurrency.lockutils [None req-46ad05df-01c0-49aa-b345-8c3db0155aea tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Lock "b954e159-4d89-4c61-a5bc-5e6c67cf278c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 218.011s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1279.657920] env[61439]: DEBUG oslo_concurrency.lockutils [None req-46ad05df-01c0-49aa-b345-8c3db0155aea tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Acquiring lock "b954e159-4d89-4c61-a5bc-5e6c67cf278c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1279.658165] env[61439]: DEBUG oslo_concurrency.lockutils [None req-46ad05df-01c0-49aa-b345-8c3db0155aea tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Lock "b954e159-4d89-4c61-a5bc-5e6c67cf278c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 
0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1279.658369] env[61439]: DEBUG oslo_concurrency.lockutils [None req-46ad05df-01c0-49aa-b345-8c3db0155aea tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Lock "b954e159-4d89-4c61-a5bc-5e6c67cf278c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1279.660501] env[61439]: INFO nova.compute.manager [None req-46ad05df-01c0-49aa-b345-8c3db0155aea tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Terminating instance [ 1279.662289] env[61439]: DEBUG nova.compute.manager [None req-46ad05df-01c0-49aa-b345-8c3db0155aea tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1279.662535] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-46ad05df-01c0-49aa-b345-8c3db0155aea tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1279.663053] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-39d2233c-54cb-43a6-aa63-1fcaecf68d27 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.673271] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63f991d0-846d-434f-9311-2cbfa0fb4e26 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.698710] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-46ad05df-01c0-49aa-b345-8c3db0155aea tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b954e159-4d89-4c61-a5bc-5e6c67cf278c could not be found. [ 1279.698918] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-46ad05df-01c0-49aa-b345-8c3db0155aea tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1279.699108] env[61439]: INFO nova.compute.manager [None req-46ad05df-01c0-49aa-b345-8c3db0155aea tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 1279.699347] env[61439]: DEBUG oslo.service.loopingcall [None req-46ad05df-01c0-49aa-b345-8c3db0155aea tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1279.699572] env[61439]: DEBUG nova.compute.manager [-] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1279.699668] env[61439]: DEBUG nova.network.neutron [-] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1279.728741] env[61439]: DEBUG nova.network.neutron [-] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1279.736787] env[61439]: INFO nova.compute.manager [-] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] Took 0.04 seconds to deallocate network for instance. 
[ 1279.831513] env[61439]: DEBUG oslo_concurrency.lockutils [None req-46ad05df-01c0-49aa-b345-8c3db0155aea tempest-ServersTestMultiNic-72568094 tempest-ServersTestMultiNic-72568094-project-member] Lock "b954e159-4d89-4c61-a5bc-5e6c67cf278c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.174s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1279.832370] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "b954e159-4d89-4c61-a5bc-5e6c67cf278c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 75.556s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1279.832623] env[61439]: INFO nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: b954e159-4d89-4c61-a5bc-5e6c67cf278c] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1279.832807] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "b954e159-4d89-4c61-a5bc-5e6c67cf278c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1280.478780] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b9295aff-5c10-40b6-8d30-df80427753ca tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Acquiring lock "42999bc8-a3be-4310-97ad-324c7f4fc8d4" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1317.197061] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1317.201580] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager.update_available_resource {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1317.213697] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1317.213908] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1317.214087] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1317.214251] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=61439) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1317.215330] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62c931dc-3bb0-45f3-932c-40e5b0d2fc94 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.224064] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1f6d6f5-3a09-41a7-af97-5464b8e743c3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.237773] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92f77c3d-c941-47f2-a801-a57acfca6f34 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.244012] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c06a2b2-8c6f-4755-b476-d8d65d603765 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.272251] env[61439]: DEBUG 
nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181490MB free_disk=35GB free_vcpus=48 pci_devices=None {{(pid=61439) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1317.272399] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1317.272620] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1317.341686] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance aeeb7c6c-7413-46b0-8632-c7224620e9b2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1317.341686] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 7b48729f-86a7-4b53-ad11-ef8a929ec947 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1317.341686] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 42999bc8-a3be-4310-97ad-324c7f4fc8d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1317.341686] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 5403acfb-04d2-4081-80c7-23d662410a20 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1317.341686] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance ba1bd6f7-6f05-4a09-a35c-6493b64feb9d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1317.341914] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1317.341949] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 4abe5722-e83b-4c40-9b82-ca84545496c8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1317.342151] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1317.342299] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1317.432604] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-440f75b1-e403-4455-8316-78f7c50e4edd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.440680] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fdbef88-0a03-4a1a-bb0e-9ecbc382bd68 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.471246] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1664b5f7-e6b0-4e67-857c-4b181d14f202 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} 
[ 1317.478052] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89fceffd-0183-4e7a-89f7-8816d820d7bd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.490959] env[61439]: DEBUG nova.compute.provider_tree [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1317.499760] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1317.514960] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=61439) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1317.514960] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.240s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1318.513125] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b 
None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1318.513471] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1318.513590] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=61439) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1319.203072] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1319.203072] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1319.203072] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1320.204112] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1320.204112] env[61439]: DEBUG nova.compute.manager [None 
req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Starting heal instance info cache {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1320.204112] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Rebuilding the list of instances to heal {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1320.220780] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1320.220955] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1320.221066] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1320.221197] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1320.221320] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Skipping network cache update for instance because it is Building. 
{{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1320.221443] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1320.221565] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1320.221724] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Didn't find any instances for network info cache update. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1322.203102] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1322.219365] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1327.891138] env[61439]: WARNING oslo_vmware.rw_handles [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1327.891138] env[61439]: ERROR oslo_vmware.rw_handles Traceback (most recent call 
last): [ 1327.891138] env[61439]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1327.891138] env[61439]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1327.891138] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1327.891138] env[61439]: ERROR oslo_vmware.rw_handles response.begin() [ 1327.891138] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1327.891138] env[61439]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1327.891138] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1327.891138] env[61439]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1327.891138] env[61439]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1327.891138] env[61439]: ERROR oslo_vmware.rw_handles [ 1327.891138] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Downloaded image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to vmware_temp/fb51c790-8044-4e58-9ab1-60d2239abdc3/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1327.893739] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Caching image {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 
1327.894953] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Copying Virtual Disk [datastore2] vmware_temp/fb51c790-8044-4e58-9ab1-60d2239abdc3/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk to [datastore2] vmware_temp/fb51c790-8044-4e58-9ab1-60d2239abdc3/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk {{(pid=61439) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1327.894953] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-aa7e5e4a-08e5-4f6f-9cd5-20bd432698ad {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1327.903795] env[61439]: DEBUG oslo_vmware.api [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Waiting for the task: (returnval){ [ 1327.903795] env[61439]: value = "task-987752" [ 1327.903795] env[61439]: _type = "Task" [ 1327.903795] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1327.911939] env[61439]: DEBUG oslo_vmware.api [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Task: {'id': task-987752, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1328.415659] env[61439]: DEBUG oslo_vmware.exceptions [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Fault InvalidArgument not matched. 
{{(pid=61439) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1328.415924] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1328.416521] env[61439]: ERROR nova.compute.manager [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1328.416521] env[61439]: Faults: ['InvalidArgument'] [ 1328.416521] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Traceback (most recent call last): [ 1328.416521] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1328.416521] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] yield resources [ 1328.416521] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1328.416521] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] self.driver.spawn(context, instance, image_meta, [ 1328.416521] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1328.416521] env[61439]: ERROR nova.compute.manager 
[instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1328.416521] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1328.416521] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] self._fetch_image_if_missing(context, vi) [ 1328.416521] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1328.416521] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] image_cache(vi, tmp_image_ds_loc) [ 1328.416521] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1328.416521] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] vm_util.copy_virtual_disk( [ 1328.416521] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1328.416521] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] session._wait_for_task(vmdk_copy_task) [ 1328.416521] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1328.416521] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] return self.wait_for_task(task_ref) [ 1328.416521] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1328.416521] env[61439]: ERROR 
nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] return evt.wait() [ 1328.416521] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1328.416521] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] result = hub.switch() [ 1328.416521] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1328.416521] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] return self.greenlet.switch() [ 1328.416521] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1328.416521] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] self.f(*self.args, **self.kw) [ 1328.416521] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1328.416521] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] raise exceptions.translate_fault(task_info.error) [ 1328.416521] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1328.416521] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Faults: ['InvalidArgument'] [ 1328.416521] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] [ 1328.417538] env[61439]: INFO nova.compute.manager [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 
tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Terminating instance [ 1328.418920] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1328.418920] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1328.418920] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d6c63dee-12eb-4d7c-9b86-d295b6f90660 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1328.421221] env[61439]: DEBUG nova.compute.manager [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1328.421431] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1328.422162] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16345c7a-949f-4620-952c-ca0fb891752e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1328.429181] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Unregistering the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1328.429418] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3b0778cc-d4bc-4938-8e09-4e60f52a2cee {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1328.431689] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1328.431870] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=61439) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1328.432830] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-54e7c5e3-971b-478f-ab42-c275a979eb32 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1328.437576] env[61439]: DEBUG oslo_vmware.api [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Waiting for the task: (returnval){ [ 1328.437576] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]526e2da3-b531-6799-202e-67195c9bd1db" [ 1328.437576] env[61439]: _type = "Task" [ 1328.437576] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1328.444848] env[61439]: DEBUG oslo_vmware.api [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]526e2da3-b531-6799-202e-67195c9bd1db, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1328.506925] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Unregistered the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1328.506925] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Deleting contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1328.507161] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Deleting the datastore file [datastore2] aeeb7c6c-7413-46b0-8632-c7224620e9b2 {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1328.507304] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d5f37209-5829-4cf0-bb57-907b477a93bc {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1328.513960] env[61439]: DEBUG oslo_vmware.api [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Waiting for the task: (returnval){ [ 1328.513960] env[61439]: value = "task-987754" [ 1328.513960] env[61439]: _type = "Task" [ 1328.513960] env[61439]: } to complete. 
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1328.522171] env[61439]: DEBUG oslo_vmware.api [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Task: {'id': task-987754, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1328.947326] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Preparing fetch location {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1328.947601] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Creating directory with path [datastore2] vmware_temp/c623215c-cde0-45f0-92f9-4957d631ec11/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1328.947824] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2415ac78-0718-4a86-986a-feba5c0dedc1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1328.958737] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Created directory with path [datastore2] vmware_temp/c623215c-cde0-45f0-92f9-4957d631ec11/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1328.958935] env[61439]: DEBUG 
nova.virt.vmwareapi.vmops [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Fetch image to [datastore2] vmware_temp/c623215c-cde0-45f0-92f9-4957d631ec11/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1328.959125] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to [datastore2] vmware_temp/c623215c-cde0-45f0-92f9-4957d631ec11/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1328.959849] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7713fd0-4a35-4aec-bb5a-5bce77c03026 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1328.966307] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53a11753-5902-4e7e-94e9-476f8fdabfe6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1328.975257] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab508850-ab48-47fe-bdfd-9c54c6af8eab {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1329.005422] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-8e16ca2a-a743-46b3-a3cd-2ee60fb73b22 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1329.011132] env[61439]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-62c7eae1-b23a-4173-97ab-e22f855c603d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1329.022789] env[61439]: DEBUG oslo_vmware.api [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Task: {'id': task-987754, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076727} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1329.023109] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Deleted the datastore file {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1329.023325] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Deleted contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1329.023556] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1329.023725] env[61439]: INFO nova.compute.manager [None 
req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Took 0.60 seconds to destroy the instance on the hypervisor. [ 1329.025825] env[61439]: DEBUG nova.compute.claims [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1329.026016] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1329.026339] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1329.033072] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1329.107875] env[61439]: DEBUG oslo_vmware.rw_handles [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d 
tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c623215c-cde0-45f0-92f9-4957d631ec11/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1329.168235] env[61439]: DEBUG oslo_vmware.rw_handles [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Completed reading data from the image iterator. {{(pid=61439) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1329.168428] env[61439]: DEBUG oslo_vmware.rw_handles [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c623215c-cde0-45f0-92f9-4957d631ec11/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=61439) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1329.227614] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a8ac2a1-c4c9-4713-916e-a3e60b44eced {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1329.235019] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db8b8c3b-cd87-4a95-a623-29c5e35697d1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1329.264977] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-802a5970-7a75-4350-bb37-1429e32aa71b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1329.271549] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bab9a6ed-a045-41f4-a1a5-49c295726771 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1329.284616] env[61439]: DEBUG nova.compute.provider_tree [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1329.292694] env[61439]: DEBUG nova.scheduler.client.report [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 
4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1329.307159] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.281s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1329.307681] env[61439]: ERROR nova.compute.manager [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1329.307681] env[61439]: Faults: ['InvalidArgument'] [ 1329.307681] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Traceback (most recent call last): [ 1329.307681] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1329.307681] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] self.driver.spawn(context, instance, image_meta, [ 1329.307681] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1329.307681] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] self._vmops.spawn(context, 
instance, image_meta, injected_files, [ 1329.307681] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1329.307681] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] self._fetch_image_if_missing(context, vi) [ 1329.307681] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1329.307681] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] image_cache(vi, tmp_image_ds_loc) [ 1329.307681] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1329.307681] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] vm_util.copy_virtual_disk( [ 1329.307681] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1329.307681] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] session._wait_for_task(vmdk_copy_task) [ 1329.307681] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1329.307681] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] return self.wait_for_task(task_ref) [ 1329.307681] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1329.307681] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] return evt.wait() [ 
1329.307681] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1329.307681] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] result = hub.switch() [ 1329.307681] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1329.307681] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] return self.greenlet.switch() [ 1329.307681] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1329.307681] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] self.f(*self.args, **self.kw) [ 1329.307681] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1329.307681] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] raise exceptions.translate_fault(task_info.error) [ 1329.307681] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1329.307681] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Faults: ['InvalidArgument'] [ 1329.307681] env[61439]: ERROR nova.compute.manager [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] [ 1329.308517] env[61439]: DEBUG nova.compute.utils [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: 
aeeb7c6c-7413-46b0-8632-c7224620e9b2] VimFaultException {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1329.309835] env[61439]: DEBUG nova.compute.manager [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Build of instance aeeb7c6c-7413-46b0-8632-c7224620e9b2 was re-scheduled: A specified parameter was not correct: fileType [ 1329.309835] env[61439]: Faults: ['InvalidArgument'] {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1329.310220] env[61439]: DEBUG nova.compute.manager [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1329.310403] env[61439]: DEBUG nova.compute.manager [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1329.310598] env[61439]: DEBUG nova.compute.manager [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1329.310753] env[61439]: DEBUG nova.network.neutron [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1329.570185] env[61439]: DEBUG nova.network.neutron [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1329.586026] env[61439]: INFO nova.compute.manager [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Took 0.28 seconds to deallocate network for instance. 
[ 1329.685104] env[61439]: INFO nova.scheduler.client.report [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Deleted allocations for instance aeeb7c6c-7413-46b0-8632-c7224620e9b2 [ 1329.710117] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8d2d9ebc-cb54-427f-bf54-f83a78f24582 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Lock "aeeb7c6c-7413-46b0-8632-c7224620e9b2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 461.010s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1329.710545] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c216e848-0514-449f-be57-d4ecaef3cb36 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Lock "aeeb7c6c-7413-46b0-8632-c7224620e9b2" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 264.966s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1329.710803] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c216e848-0514-449f-be57-d4ecaef3cb36 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Acquiring lock "aeeb7c6c-7413-46b0-8632-c7224620e9b2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1329.711105] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c216e848-0514-449f-be57-d4ecaef3cb36 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Lock "aeeb7c6c-7413-46b0-8632-c7224620e9b2-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1329.711230] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c216e848-0514-449f-be57-d4ecaef3cb36 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Lock "aeeb7c6c-7413-46b0-8632-c7224620e9b2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1329.713871] env[61439]: INFO nova.compute.manager [None req-c216e848-0514-449f-be57-d4ecaef3cb36 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Terminating instance [ 1329.716290] env[61439]: DEBUG nova.compute.manager [None req-c216e848-0514-449f-be57-d4ecaef3cb36 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1329.716290] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-c216e848-0514-449f-be57-d4ecaef3cb36 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1329.716375] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d78596e6-a172-4660-b2d7-b9b8833b7572 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1329.725547] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29cb00a3-15b5-477b-956f-36d5390b2d93 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1329.751913] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-c216e848-0514-449f-be57-d4ecaef3cb36 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance aeeb7c6c-7413-46b0-8632-c7224620e9b2 could not be found. 
[ 1329.752152] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-c216e848-0514-449f-be57-d4ecaef3cb36 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1329.752344] env[61439]: INFO nova.compute.manager [None req-c216e848-0514-449f-be57-d4ecaef3cb36 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1329.752651] env[61439]: DEBUG oslo.service.loopingcall [None req-c216e848-0514-449f-be57-d4ecaef3cb36 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1329.752887] env[61439]: DEBUG nova.compute.manager [-] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1329.753090] env[61439]: DEBUG nova.network.neutron [-] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1329.778660] env[61439]: DEBUG nova.network.neutron [-] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1329.786924] env[61439]: INFO nova.compute.manager [-] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] Took 0.03 seconds to deallocate network for instance. 
[ 1329.875873] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c216e848-0514-449f-be57-d4ecaef3cb36 tempest-ListImageFiltersTestJSON-1130217809 tempest-ListImageFiltersTestJSON-1130217809-project-member] Lock "aeeb7c6c-7413-46b0-8632-c7224620e9b2" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.165s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1329.877647] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "aeeb7c6c-7413-46b0-8632-c7224620e9b2" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 125.600s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1329.877647] env[61439]: INFO nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: aeeb7c6c-7413-46b0-8632-c7224620e9b2] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1329.877647] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "aeeb7c6c-7413-46b0-8632-c7224620e9b2" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1330.280922] env[61439]: DEBUG oslo_concurrency.lockutils [None req-928d894d-cf38-44e9-94bb-9b4493d0e9b0 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Acquiring lock "5403acfb-04d2-4081-80c7-23d662410a20" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1378.795808] env[61439]: WARNING oslo_vmware.rw_handles [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1378.795808] env[61439]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1378.795808] env[61439]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1378.795808] env[61439]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1378.795808] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1378.795808] env[61439]: ERROR oslo_vmware.rw_handles response.begin() [ 1378.795808] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1378.795808] env[61439]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1378.795808] env[61439]: ERROR 
oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1378.795808] env[61439]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1378.795808] env[61439]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1378.795808] env[61439]: ERROR oslo_vmware.rw_handles [ 1378.796692] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Downloaded image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to vmware_temp/c623215c-cde0-45f0-92f9-4957d631ec11/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1378.798516] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Caching image {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1378.798763] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Copying Virtual Disk [datastore2] vmware_temp/c623215c-cde0-45f0-92f9-4957d631ec11/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk to [datastore2] vmware_temp/c623215c-cde0-45f0-92f9-4957d631ec11/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk {{(pid=61439) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1378.799052] env[61439]: DEBUG oslo_vmware.service [-] Invoking 
VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-afdecb37-4afe-4633-a355-2c1666b80ded {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1378.807422] env[61439]: DEBUG oslo_vmware.api [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Waiting for the task: (returnval){ [ 1378.807422] env[61439]: value = "task-987755" [ 1378.807422] env[61439]: _type = "Task" [ 1378.807422] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1378.815200] env[61439]: DEBUG oslo_vmware.api [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Task: {'id': task-987755, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1379.202359] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1379.202632] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1379.202790] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1379.202952] env[61439]: DEBUG 
oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1379.203116] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=61439) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1379.203274] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager.update_available_resource {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1379.217821] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1379.218056] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1379.218230] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1379.218392] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Auditing locally available 
compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=61439) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1379.219471] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a41379e-2105-4628-b048-25bc2c2b519b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.228045] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8ba290b-79bf-4219-a789-f0302c161973 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.242878] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1d9f473-58ee-47e6-8802-f4bfba5dbfcb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.249044] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f56c20a-51ba-4763-8e1d-0374756efe85 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.277280] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181527MB free_disk=35GB free_vcpus=48 pci_devices=None {{(pid=61439) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1379.277421] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1379.277612] 
env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1379.318654] env[61439]: DEBUG oslo_vmware.exceptions [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Fault InvalidArgument not matched. {{(pid=61439) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1379.321535] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1379.322034] env[61439]: ERROR nova.compute.manager [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1379.322034] env[61439]: Faults: ['InvalidArgument'] [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Traceback (most recent call last): [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 
7b48729f-86a7-4b53-ad11-ef8a929ec947] yield resources [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] self.driver.spawn(context, instance, image_meta, [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] self._fetch_image_if_missing(context, vi) [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] image_cache(vi, tmp_image_ds_loc) [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] vm_util.copy_virtual_disk( [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 
7b48729f-86a7-4b53-ad11-ef8a929ec947] session._wait_for_task(vmdk_copy_task) [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] return self.wait_for_task(task_ref) [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] return evt.wait() [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] result = hub.switch() [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] return self.greenlet.switch() [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] self.f(*self.args, **self.kw) [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1379.322034] env[61439]: ERROR nova.compute.manager 
[instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] raise exceptions.translate_fault(task_info.error) [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Faults: ['InvalidArgument'] [ 1379.322034] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] [ 1379.323455] env[61439]: INFO nova.compute.manager [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Terminating instance [ 1379.324096] env[61439]: DEBUG oslo_concurrency.lockutils [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1379.324316] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1379.325680] env[61439]: DEBUG nova.compute.manager [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1379.325680] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1379.325680] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-36446793-07f2-4c84-a86d-a19706d787db {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.327829] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12531bd1-370a-481e-83a7-88bd381a3634 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.334858] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Unregistering the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1379.335083] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-fe5478d9-63f6-4ef9-9f53-ea3414074906 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.337431] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1379.338078] env[61439]: DEBUG 
nova.virt.vmwareapi.vmops [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=61439) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1379.338705] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fc84e5cb-effd-4a53-a16a-33b96de4483a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.343668] env[61439]: DEBUG oslo_vmware.api [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Waiting for the task: (returnval){ [ 1379.343668] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]5230bfc4-22be-0103-17e7-2eb47ca46533" [ 1379.343668] env[61439]: _type = "Task" [ 1379.343668] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1379.344622] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 7b48729f-86a7-4b53-ad11-ef8a929ec947 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1379.344706] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 42999bc8-a3be-4310-97ad-324c7f4fc8d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1379.344840] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 5403acfb-04d2-4081-80c7-23d662410a20 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1379.344999] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance ba1bd6f7-6f05-4a09-a35c-6493b64feb9d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1379.345314] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1379.345314] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 4abe5722-e83b-4c40-9b82-ca84545496c8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1379.345636] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1379.345636] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1379.355432] env[61439]: DEBUG oslo_vmware.api [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]5230bfc4-22be-0103-17e7-2eb47ca46533, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1379.414310] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Unregistered the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1379.414422] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Deleting contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1379.414602] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Deleting the datastore file [datastore2] 7b48729f-86a7-4b53-ad11-ef8a929ec947 {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1379.414864] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-078f81ce-7a37-4022-b58f-ed49cbd0f330 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.420488] env[61439]: DEBUG oslo_vmware.api [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Waiting for the task: (returnval){ [ 1379.420488] env[61439]: value = "task-987757" [ 1379.420488] env[61439]: _type = "Task" [ 1379.420488] env[61439]: } to complete. 
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1379.430427] env[61439]: DEBUG oslo_vmware.api [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Task: {'id': task-987757, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1379.437546] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fde0b3f-65df-427b-834c-01a56abb67b6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.444138] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7852ab28-6046-4784-a727-c47e192c1c76 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.475933] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e057b03-4a36-42ee-a597-5307248985ef {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.483142] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c72c197c-0641-4421-a038-6f143cdd5eca {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.496718] env[61439]: DEBUG nova.compute.provider_tree [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1379.504837] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] 
Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1379.520734] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=61439) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1379.520949] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.243s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1379.857748] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Preparing fetch location {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1379.858149] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Creating directory with path [datastore2] vmware_temp/f57fe271-1c1c-4123-a806-a21c5260a28d/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 
1379.858246] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-25157136-e435-4a1a-8da4-b298f340b769 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.868843] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Created directory with path [datastore2] vmware_temp/f57fe271-1c1c-4123-a806-a21c5260a28d/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1379.869033] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Fetch image to [datastore2] vmware_temp/f57fe271-1c1c-4123-a806-a21c5260a28d/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1379.869207] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to [datastore2] vmware_temp/f57fe271-1c1c-4123-a806-a21c5260a28d/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1379.869903] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83060bcd-e68f-4ffd-8db0-fbe80426e7b6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.875998] 
env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85fbb3e1-cbdb-4fd0-9788-b3eb1fac484a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.884605] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8af95056-1a4b-461a-981a-9fa310520327 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.915104] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-950fa8fa-c577-45ca-851d-63390aa332e8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.920359] env[61439]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-60559964-07b0-473d-849a-8466a1e91bcb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.928927] env[61439]: DEBUG oslo_vmware.api [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Task: {'id': task-987757, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07528} completed successfully. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1379.929167] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Deleted the datastore file {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1379.929345] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Deleted contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1379.929539] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1379.929741] env[61439]: INFO nova.compute.manager [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1379.931852] env[61439]: DEBUG nova.compute.claims [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1379.932056] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1379.932320] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1379.943311] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1379.991947] env[61439]: DEBUG oslo_vmware.rw_handles [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = 
https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f57fe271-1c1c-4123-a806-a21c5260a28d/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1380.050908] env[61439]: DEBUG oslo_vmware.rw_handles [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Completed reading data from the image iterator. {{(pid=61439) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1380.051115] env[61439]: DEBUG oslo_vmware.rw_handles [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f57fe271-1c1c-4123-a806-a21c5260a28d/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=61439) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1380.101758] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb589792-4fd7-4d4b-8213-740249ea2b5d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1380.109197] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a1ac7ca-8dad-4666-8165-c2a498e48258 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1380.138576] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6191ed6-ac24-44b2-af6c-7c3998981d97 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1380.144747] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f91847b-9986-47d1-a04b-1be88110fa12 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1380.157324] env[61439]: DEBUG nova.compute.provider_tree [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1380.167017] env[61439]: DEBUG nova.scheduler.client.report [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 
'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1380.179639] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.247s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1380.180162] env[61439]: ERROR nova.compute.manager [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1380.180162] env[61439]: Faults: ['InvalidArgument'] [ 1380.180162] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Traceback (most recent call last): [ 1380.180162] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1380.180162] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] self.driver.spawn(context, instance, image_meta, [ 1380.180162] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1380.180162] env[61439]: ERROR nova.compute.manager [instance: 
7b48729f-86a7-4b53-ad11-ef8a929ec947] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1380.180162] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1380.180162] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] self._fetch_image_if_missing(context, vi) [ 1380.180162] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1380.180162] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] image_cache(vi, tmp_image_ds_loc) [ 1380.180162] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1380.180162] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] vm_util.copy_virtual_disk( [ 1380.180162] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1380.180162] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] session._wait_for_task(vmdk_copy_task) [ 1380.180162] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1380.180162] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] return self.wait_for_task(task_ref) [ 1380.180162] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1380.180162] env[61439]: ERROR nova.compute.manager 
[instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] return evt.wait() [ 1380.180162] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1380.180162] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] result = hub.switch() [ 1380.180162] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1380.180162] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] return self.greenlet.switch() [ 1380.180162] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1380.180162] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] self.f(*self.args, **self.kw) [ 1380.180162] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1380.180162] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] raise exceptions.translate_fault(task_info.error) [ 1380.180162] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1380.180162] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Faults: ['InvalidArgument'] [ 1380.180162] env[61439]: ERROR nova.compute.manager [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] [ 1380.181079] env[61439]: DEBUG nova.compute.utils [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d 
tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] VimFaultException {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1380.182238] env[61439]: DEBUG nova.compute.manager [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Build of instance 7b48729f-86a7-4b53-ad11-ef8a929ec947 was re-scheduled: A specified parameter was not correct: fileType [ 1380.182238] env[61439]: Faults: ['InvalidArgument'] {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1380.182627] env[61439]: DEBUG nova.compute.manager [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1380.182801] env[61439]: DEBUG nova.compute.manager [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1380.182959] env[61439]: DEBUG nova.compute.manager [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1380.183136] env[61439]: DEBUG nova.network.neutron [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1380.519757] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1380.520019] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1380.539431] env[61439]: DEBUG nova.network.neutron [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1380.549408] env[61439]: INFO nova.compute.manager [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 
tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Took 0.37 seconds to deallocate network for instance. [ 1380.644460] env[61439]: INFO nova.scheduler.client.report [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Deleted allocations for instance 7b48729f-86a7-4b53-ad11-ef8a929ec947 [ 1380.666479] env[61439]: DEBUG oslo_concurrency.lockutils [None req-fe6b4a7f-112c-4b01-bf92-4abb4412760d tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Lock "7b48729f-86a7-4b53-ad11-ef8a929ec947" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 443.332s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1380.666858] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9b37df87-4d2a-441b-99b1-cfc44f0dbe6b tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Lock "7b48729f-86a7-4b53-ad11-ef8a929ec947" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 248.244s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1380.667201] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9b37df87-4d2a-441b-99b1-cfc44f0dbe6b tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Acquiring lock "7b48729f-86a7-4b53-ad11-ef8a929ec947-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1380.667440] env[61439]: DEBUG oslo_concurrency.lockutils [None 
req-9b37df87-4d2a-441b-99b1-cfc44f0dbe6b tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Lock "7b48729f-86a7-4b53-ad11-ef8a929ec947-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1380.667612] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9b37df87-4d2a-441b-99b1-cfc44f0dbe6b tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Lock "7b48729f-86a7-4b53-ad11-ef8a929ec947-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1380.670430] env[61439]: INFO nova.compute.manager [None req-9b37df87-4d2a-441b-99b1-cfc44f0dbe6b tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Terminating instance [ 1380.672882] env[61439]: DEBUG nova.compute.manager [None req-9b37df87-4d2a-441b-99b1-cfc44f0dbe6b tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1380.673289] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-9b37df87-4d2a-441b-99b1-cfc44f0dbe6b tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1380.673716] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-22ba90a9-a125-4195-adea-37c3b3a53329 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1380.684033] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-879f974a-a401-4493-83cc-95ef889c7def {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1380.709344] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-9b37df87-4d2a-441b-99b1-cfc44f0dbe6b tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7b48729f-86a7-4b53-ad11-ef8a929ec947 could not be found. 
[ 1380.710093] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-9b37df87-4d2a-441b-99b1-cfc44f0dbe6b tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1380.710093] env[61439]: INFO nova.compute.manager [None req-9b37df87-4d2a-441b-99b1-cfc44f0dbe6b tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1380.710093] env[61439]: DEBUG oslo.service.loopingcall [None req-9b37df87-4d2a-441b-99b1-cfc44f0dbe6b tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1380.710217] env[61439]: DEBUG nova.compute.manager [-] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1380.710840] env[61439]: DEBUG nova.network.neutron [-] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1380.739686] env[61439]: DEBUG nova.network.neutron [-] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1380.750896] env[61439]: INFO nova.compute.manager [-] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] Took 0.04 seconds to deallocate network for instance. 
[ 1380.838566] env[61439]: DEBUG oslo_concurrency.lockutils [None req-9b37df87-4d2a-441b-99b1-cfc44f0dbe6b tempest-ServerDiagnosticsNegativeTest-937147690 tempest-ServerDiagnosticsNegativeTest-937147690-project-member] Lock "7b48729f-86a7-4b53-ad11-ef8a929ec947" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.172s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1380.839884] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "7b48729f-86a7-4b53-ad11-ef8a929ec947" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 176.563s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1380.840135] env[61439]: INFO nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 7b48729f-86a7-4b53-ad11-ef8a929ec947] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1380.840326] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "7b48729f-86a7-4b53-ad11-ef8a929ec947" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1382.203480] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1382.203480] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Starting heal instance info cache {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1382.203480] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Rebuilding the list of instances to heal {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1382.217892] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1382.218085] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Skipping network cache update for instance because it is Building. 
{{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1382.218273] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1382.218444] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1382.218616] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1382.218781] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Didn't find any instances for network info cache update. 
{{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1383.202339] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1426.622622] env[61439]: WARNING oslo_vmware.rw_handles [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1426.622622] env[61439]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1426.622622] env[61439]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1426.622622] env[61439]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1426.622622] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1426.622622] env[61439]: ERROR oslo_vmware.rw_handles response.begin() [ 1426.622622] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1426.622622] env[61439]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1426.622622] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1426.622622] env[61439]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1426.622622] env[61439]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1426.622622] env[61439]: ERROR oslo_vmware.rw_handles [ 1426.623374] env[61439]: DEBUG nova.virt.vmwareapi.images [None 
req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Downloaded image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to vmware_temp/f57fe271-1c1c-4123-a806-a21c5260a28d/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1426.625386] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Caching image {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1426.625386] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Copying Virtual Disk [datastore2] vmware_temp/f57fe271-1c1c-4123-a806-a21c5260a28d/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk to [datastore2] vmware_temp/f57fe271-1c1c-4123-a806-a21c5260a28d/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk {{(pid=61439) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1426.625649] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-3248a9e8-a4ba-419f-920b-bd9632bf77bf {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1426.637403] env[61439]: DEBUG oslo_vmware.api [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Waiting for the task: (returnval){ [ 1426.637403] env[61439]: value = "task-987758" [ 1426.637403] 
env[61439]: _type = "Task" [ 1426.637403] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1426.647347] env[61439]: DEBUG oslo_vmware.api [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Task: {'id': task-987758, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1427.154445] env[61439]: DEBUG oslo_vmware.exceptions [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Fault InvalidArgument not matched. {{(pid=61439) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1427.154896] env[61439]: DEBUG oslo_concurrency.lockutils [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1427.155814] env[61439]: ERROR nova.compute.manager [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1427.155814] env[61439]: Faults: ['InvalidArgument'] [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Traceback (most recent call last): [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 
42999bc8-a3be-4310-97ad-324c7f4fc8d4] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] yield resources [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] self.driver.spawn(context, instance, image_meta, [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] self._fetch_image_if_missing(context, vi) [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] image_cache(vi, tmp_image_ds_loc) [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] vm_util.copy_virtual_disk( [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] session._wait_for_task(vmdk_copy_task) [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] return self.wait_for_task(task_ref) [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] return evt.wait() [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] result = hub.switch() [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] return self.greenlet.switch() [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] self.f(*self.args, **self.kw) [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] raise exceptions.translate_fault(task_info.error) [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Faults: ['InvalidArgument'] [ 1427.155814] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] [ 1427.157147] env[61439]: INFO nova.compute.manager [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Terminating instance [ 1427.158490] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1427.158814] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1427.159184] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-76aa8342-7088-4aad-a911-efe5067b0f11 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1427.162585] env[61439]: DEBUG nova.compute.manager [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1427.162835] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1427.164006] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6ac0665-53db-416a-98f2-802e27f292bb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1427.173766] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Unregistering the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1427.175289] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c85ff041-ceca-4160-ae50-dd7a93086382 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1427.177561] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Created directory with path [datastore2] 
devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1427.177839] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=61439) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1427.178919] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7269d4e1-ecb2-4733-89bd-e4566af6f95e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1427.185887] env[61439]: DEBUG oslo_vmware.api [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Waiting for the task: (returnval){ [ 1427.185887] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52080593-5f12-154f-7e88-d382b0482483" [ 1427.185887] env[61439]: _type = "Task" [ 1427.185887] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1427.197237] env[61439]: DEBUG oslo_vmware.api [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52080593-5f12-154f-7e88-d382b0482483, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1427.625971] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Unregistered the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1427.626333] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Deleting contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1427.626369] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Deleting the datastore file [datastore2] 42999bc8-a3be-4310-97ad-324c7f4fc8d4 {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1427.626636] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5ecdb713-6d29-4a76-82b4-fc8f0003167e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1427.633310] env[61439]: DEBUG oslo_vmware.api [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Waiting for the task: (returnval){ [ 1427.633310] env[61439]: value = "task-987760" [ 1427.633310] env[61439]: _type = "Task" [ 1427.633310] env[61439]: } to complete. 
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1427.641065] env[61439]: DEBUG oslo_vmware.api [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Task: {'id': task-987760, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1427.695882] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Preparing fetch location {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1427.696057] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Creating directory with path [datastore2] vmware_temp/e9ba4e42-7f04-46b5-9da3-79fa8f947b8e/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1427.696348] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8e4466f5-73e0-4696-a0ec-4c266af921c3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1427.716771] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Created directory with path [datastore2] vmware_temp/e9ba4e42-7f04-46b5-9da3-79fa8f947b8e/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1427.716989] 
env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Fetch image to [datastore2] vmware_temp/e9ba4e42-7f04-46b5-9da3-79fa8f947b8e/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1427.717175] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to [datastore2] vmware_temp/e9ba4e42-7f04-46b5-9da3-79fa8f947b8e/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1427.717956] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-427dd73d-1e5e-4133-9711-1914c3de1550 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1427.724956] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba1483f5-3d2e-400e-92c4-142250c3c94d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1427.734016] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a9cc40a-0c7c-46dc-a0ef-b4dcda127f25 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1427.763474] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91423607-9398-4c7e-947e-6d09f175680d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1427.769369] env[61439]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5f97d63b-51ea-47f9-871b-03b741e84daf {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1427.791094] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1427.838122] env[61439]: DEBUG oslo_vmware.rw_handles [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e9ba4e42-7f04-46b5-9da3-79fa8f947b8e/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1427.897818] env[61439]: DEBUG oslo_vmware.rw_handles [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Completed reading data from the image iterator. {{(pid=61439) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1427.898036] env[61439]: DEBUG oslo_vmware.rw_handles [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e9ba4e42-7f04-46b5-9da3-79fa8f947b8e/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1428.144109] env[61439]: DEBUG oslo_vmware.api [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Task: {'id': task-987760, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.080988} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1428.144338] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Deleted the datastore file {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1428.144525] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Deleted contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1428.144736] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1428.144926] env[61439]: INFO nova.compute.manager [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Took 0.98 seconds to destroy the instance on the hypervisor.
[ 1428.146967] env[61439]: DEBUG nova.compute.claims [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1428.147157] env[61439]: DEBUG oslo_concurrency.lockutils [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1428.147371] env[61439]: DEBUG oslo_concurrency.lockutils [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1428.254543] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af6a98cd-db1d-497b-99b8-e1256f8cee2e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1428.261572] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9767f58f-0ca3-447e-91f0-4bf687e2e9e8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1428.290700] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8001bca-389f-4cbc-8d46-edaee6216c78 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1428.297377] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8df948a-3b02-41d2-98ca-c39a2c48cc64 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1428.310058] env[61439]: DEBUG nova.compute.provider_tree [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1428.318281] env[61439]: DEBUG nova.scheduler.client.report [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1428.330772] env[61439]: DEBUG oslo_concurrency.lockutils [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.183s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1428.331327] env[61439]: ERROR nova.compute.manager [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1428.331327] env[61439]: Faults: ['InvalidArgument']
[ 1428.331327] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Traceback (most recent call last):
[ 1428.331327] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1428.331327] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4]     self.driver.spawn(context, instance, image_meta,
[ 1428.331327] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1428.331327] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1428.331327] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1428.331327] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4]     self._fetch_image_if_missing(context, vi)
[ 1428.331327] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1428.331327] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4]     image_cache(vi, tmp_image_ds_loc)
[ 1428.331327] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1428.331327] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4]     vm_util.copy_virtual_disk(
[ 1428.331327] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1428.331327] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4]     session._wait_for_task(vmdk_copy_task)
[ 1428.331327] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1428.331327] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4]     return self.wait_for_task(task_ref)
[ 1428.331327] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1428.331327] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4]     return evt.wait()
[ 1428.331327] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1428.331327] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4]     result = hub.switch()
[ 1428.331327] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1428.331327] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4]     return self.greenlet.switch()
[ 1428.331327] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1428.331327] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4]     self.f(*self.args, **self.kw)
[ 1428.331327] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1428.331327] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4]     raise exceptions.translate_fault(task_info.error)
[ 1428.331327] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1428.331327] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Faults: ['InvalidArgument']
[ 1428.331327] env[61439]: ERROR nova.compute.manager [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4]
[ 1428.332269] env[61439]: DEBUG nova.compute.utils [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] VimFaultException {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1428.333405] env[61439]: DEBUG nova.compute.manager [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Build of instance 42999bc8-a3be-4310-97ad-324c7f4fc8d4 was re-scheduled: A specified parameter was not correct: fileType
[ 1428.333405] env[61439]: Faults: ['InvalidArgument'] {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1428.333819] env[61439]: DEBUG nova.compute.manager [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1428.333995] env[61439]: DEBUG nova.compute.manager [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 1428.334192] env[61439]: DEBUG nova.compute.manager [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1428.334357] env[61439]: DEBUG nova.network.neutron [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1428.852831] env[61439]: DEBUG nova.network.neutron [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1428.874059] env[61439]: INFO nova.compute.manager [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Took 0.54 seconds to deallocate network for instance.
[ 1428.990843] env[61439]: INFO nova.scheduler.client.report [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Deleted allocations for instance 42999bc8-a3be-4310-97ad-324c7f4fc8d4
[ 1429.014460] env[61439]: DEBUG oslo_concurrency.lockutils [None req-11275d1e-ec11-4e51-998a-8c3434f8bc46 tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Lock "42999bc8-a3be-4310-97ad-324c7f4fc8d4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 344.293s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1429.015816] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "42999bc8-a3be-4310-97ad-324c7f4fc8d4" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 224.738s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1429.015816] env[61439]: INFO nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] During sync_power_state the instance has a pending task (spawning). Skip.
[ 1429.015816] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "42999bc8-a3be-4310-97ad-324c7f4fc8d4" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1429.015816] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b9295aff-5c10-40b6-8d30-df80427753ca tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Lock "42999bc8-a3be-4310-97ad-324c7f4fc8d4" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 148.537s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1429.015816] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b9295aff-5c10-40b6-8d30-df80427753ca tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Acquiring lock "42999bc8-a3be-4310-97ad-324c7f4fc8d4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1429.015816] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b9295aff-5c10-40b6-8d30-df80427753ca tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Lock "42999bc8-a3be-4310-97ad-324c7f4fc8d4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1429.015816] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b9295aff-5c10-40b6-8d30-df80427753ca tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Lock "42999bc8-a3be-4310-97ad-324c7f4fc8d4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1429.017782] env[61439]: INFO nova.compute.manager [None req-b9295aff-5c10-40b6-8d30-df80427753ca tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Terminating instance
[ 1429.019443] env[61439]: DEBUG nova.compute.manager [None req-b9295aff-5c10-40b6-8d30-df80427753ca tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1429.019633] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-b9295aff-5c10-40b6-8d30-df80427753ca tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1429.020099] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5e973e99-caef-4ca0-9ff6-9fce1f88b018 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1429.029261] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3daf41b8-602f-44e0-b951-92215708a05e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1429.054315] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-b9295aff-5c10-40b6-8d30-df80427753ca tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 42999bc8-a3be-4310-97ad-324c7f4fc8d4 could not be found.
[ 1429.054516] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-b9295aff-5c10-40b6-8d30-df80427753ca tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1429.054694] env[61439]: INFO nova.compute.manager [None req-b9295aff-5c10-40b6-8d30-df80427753ca tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 1429.054937] env[61439]: DEBUG oslo.service.loopingcall [None req-b9295aff-5c10-40b6-8d30-df80427753ca tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1429.055164] env[61439]: DEBUG nova.compute.manager [-] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1429.055265] env[61439]: DEBUG nova.network.neutron [-] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1429.078019] env[61439]: DEBUG nova.network.neutron [-] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1429.086589] env[61439]: INFO nova.compute.manager [-] [instance: 42999bc8-a3be-4310-97ad-324c7f4fc8d4] Took 0.03 seconds to deallocate network for instance.
[ 1429.166906] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b9295aff-5c10-40b6-8d30-df80427753ca tempest-AttachVolumeShelveTestJSON-664442013 tempest-AttachVolumeShelveTestJSON-664442013-project-member] Lock "42999bc8-a3be-4310-97ad-324c7f4fc8d4" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.151s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1439.201951] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager.update_available_resource {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1439.213135] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1439.213356] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1439.213528] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1439.213689] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=61439) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 1439.214815] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90b81125-b866-43af-b567-c196e15ed365 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1439.223477] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-796af2dc-cebd-4ccc-a1a8-ffed6cdb50a0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1439.236903] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d22a859-9271-4a74-8319-9c3198f0d1c8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1439.242871] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04f2b41b-a516-4809-bbdd-93d17dc4239e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1439.272089] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181566MB free_disk=35GB free_vcpus=48 pci_devices=None {{(pid=61439) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 1439.272223] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1439.272414] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1439.327116] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 5403acfb-04d2-4081-80c7-23d662410a20 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1439.327286] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance ba1bd6f7-6f05-4a09-a35c-6493b64feb9d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1439.327418] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1439.327542] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 4abe5722-e83b-4c40-9b82-ca84545496c8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1439.327724] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Total usable vcpus: 48, total allocated vcpus: 4 {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 1439.327865] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1024MB phys_disk=200GB used_disk=4GB total_vcpus=48 used_vcpus=4 pci_stats=[] {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 1439.384207] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0da7281b-8e7f-49aa-a8c2-5a8b0d137cee {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1439.391655] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eabfe49f-4cab-4c4c-ae6d-4a39443570f8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1439.423054] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c359095-4265-4e74-a96b-9b2d1a403668 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1439.429714] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d710a60c-3eb2-4bda-94aa-ced0dbc7a3a6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1439.442337] env[61439]: DEBUG nova.compute.provider_tree [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1439.450360] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1439.464135] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=61439) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 1439.464311] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.192s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1440.459434] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1440.459814] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1440.459814] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1440.459972] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1440.460124] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=61439) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 1441.202500] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1442.202654] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1443.197834] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1443.212203] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1443.212513] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Starting heal instance info cache {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 1443.212513] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Rebuilding the list of instances to heal {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 1443.224777] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1443.224931] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1443.225075] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1443.225206] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1443.225330] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Didn't find any instances for network info cache update.
{{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1443.225739] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1459.586856] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7ef041dc-ea3e-4a67-8b3b-030a1eaa4fc9 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "ba1bd6f7-6f05-4a09-a35c-6493b64feb9d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1461.733097] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cb4e165-8200-426e-abac-27cf6afc7014 tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Acquiring lock "4abe5722-e83b-4c40-9b82-ca84545496c8" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1477.517948] env[61439]: WARNING oslo_vmware.rw_handles [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1477.517948] env[61439]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1477.517948] env[61439]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1477.517948] env[61439]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1477.517948] env[61439]: ERROR 
oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1477.517948] env[61439]: ERROR oslo_vmware.rw_handles response.begin() [ 1477.517948] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1477.517948] env[61439]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1477.517948] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1477.517948] env[61439]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1477.517948] env[61439]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1477.517948] env[61439]: ERROR oslo_vmware.rw_handles [ 1477.518647] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Downloaded image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to vmware_temp/e9ba4e42-7f04-46b5-9da3-79fa8f947b8e/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1477.520660] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Caching image {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1477.520929] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Copying Virtual Disk [datastore2] 
vmware_temp/e9ba4e42-7f04-46b5-9da3-79fa8f947b8e/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk to [datastore2] vmware_temp/e9ba4e42-7f04-46b5-9da3-79fa8f947b8e/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk {{(pid=61439) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1477.521244] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ed4d3d4e-6b68-4f5a-a30d-ce30e7eeafc6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1477.528438] env[61439]: DEBUG oslo_vmware.api [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Waiting for the task: (returnval){ [ 1477.528438] env[61439]: value = "task-987761" [ 1477.528438] env[61439]: _type = "Task" [ 1477.528438] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1477.536020] env[61439]: DEBUG oslo_vmware.api [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Task: {'id': task-987761, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1478.039082] env[61439]: DEBUG oslo_vmware.exceptions [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Fault InvalidArgument not matched. 
{{(pid=61439) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1478.039360] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1478.039921] env[61439]: ERROR nova.compute.manager [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1478.039921] env[61439]: Faults: ['InvalidArgument'] [ 1478.039921] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Traceback (most recent call last): [ 1478.039921] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1478.039921] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] yield resources [ 1478.039921] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1478.039921] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] self.driver.spawn(context, instance, image_meta, [ 1478.039921] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1478.039921] env[61439]: 
ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1478.039921] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1478.039921] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] self._fetch_image_if_missing(context, vi) [ 1478.039921] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1478.039921] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] image_cache(vi, tmp_image_ds_loc) [ 1478.039921] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1478.039921] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] vm_util.copy_virtual_disk( [ 1478.039921] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1478.039921] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] session._wait_for_task(vmdk_copy_task) [ 1478.039921] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1478.039921] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] return self.wait_for_task(task_ref) [ 1478.039921] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1478.039921] 
env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] return evt.wait() [ 1478.039921] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1478.039921] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] result = hub.switch() [ 1478.039921] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1478.039921] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] return self.greenlet.switch() [ 1478.039921] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1478.039921] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] self.f(*self.args, **self.kw) [ 1478.039921] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1478.039921] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] raise exceptions.translate_fault(task_info.error) [ 1478.039921] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1478.039921] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Faults: ['InvalidArgument'] [ 1478.039921] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] [ 1478.041114] env[61439]: INFO nova.compute.manager [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 
tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Terminating instance [ 1478.041705] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1478.041911] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1478.042152] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0d9ed288-7704-451b-8dde-ef0bf94a1c2b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1478.044236] env[61439]: DEBUG nova.compute.manager [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1478.044423] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1478.045116] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be0d49cd-5cb7-481a-b658-9910bf695fb6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1478.051886] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Unregistering the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1478.052132] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d58425ad-37a6-4202-bc1e-82ff762ce06e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1478.054289] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1478.054456] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=61439) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1478.055416] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9988c22f-d122-4bc6-a635-09dd35da97ef {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1478.060127] env[61439]: DEBUG oslo_vmware.api [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for the task: (returnval){ [ 1478.060127] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]5289aa8f-8d1c-b980-6819-f4ac6e4c1591" [ 1478.060127] env[61439]: _type = "Task" [ 1478.060127] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1478.068220] env[61439]: DEBUG oslo_vmware.api [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]5289aa8f-8d1c-b980-6819-f4ac6e4c1591, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1478.121954] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Unregistered the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1478.122166] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Deleting contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1478.122351] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Deleting the datastore file [datastore2] 5403acfb-04d2-4081-80c7-23d662410a20 {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1478.122640] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-057b7493-cb59-4f99-ad7c-a8e96fc55296 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1478.129126] env[61439]: DEBUG oslo_vmware.api [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Waiting for the task: (returnval){ [ 1478.129126] env[61439]: value = "task-987763" [ 1478.129126] env[61439]: _type = "Task" [ 1478.129126] env[61439]: } to complete. 
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1478.136421] env[61439]: DEBUG oslo_vmware.api [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Task: {'id': task-987763, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1478.570889] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Preparing fetch location {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1478.571246] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Creating directory with path [datastore2] vmware_temp/24afee06-a89b-486e-92b9-2f0a2e3c5513/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1478.571391] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-47bafc6e-fea3-4ed1-af32-969da7cdfbf1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1478.581870] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Created directory with path [datastore2] vmware_temp/24afee06-a89b-486e-92b9-2f0a2e3c5513/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1478.582072] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d 
tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Fetch image to [datastore2] vmware_temp/24afee06-a89b-486e-92b9-2f0a2e3c5513/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1478.582247] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to [datastore2] vmware_temp/24afee06-a89b-486e-92b9-2f0a2e3c5513/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1478.582976] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2f8488d-66cd-4148-a905-9e6c5ae0d81f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1478.589201] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6de4a21-7697-46a3-8e01-b7ce9d23eb8a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1478.598090] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71babec6-880a-4728-9373-ea1aba2b329d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1478.627844] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9868a241-fb01-43a4-8b72-c2fdd369afba {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1478.638609] 
env[61439]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-eb7db5a9-3110-42bf-b7f3-ab519ec9bbf3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1478.640211] env[61439]: DEBUG oslo_vmware.api [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Task: {'id': task-987763, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069762} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1478.640467] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Deleted the datastore file {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1478.640628] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Deleted contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1478.640962] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1478.641162] env[61439]: INFO nova.compute.manager [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 
tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Took 0.60 seconds to destroy the instance on the hypervisor. [ 1478.643187] env[61439]: DEBUG nova.compute.claims [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1478.643376] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1478.643610] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1478.660132] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1478.733105] env[61439]: DEBUG oslo_vmware.rw_handles [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 
tempest-ServersTestJSON-818634346-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/24afee06-a89b-486e-92b9-2f0a2e3c5513/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1478.798353] env[61439]: DEBUG oslo_vmware.rw_handles [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Completed reading data from the image iterator. {{(pid=61439) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1478.798353] env[61439]: DEBUG oslo_vmware.rw_handles [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/24afee06-a89b-486e-92b9-2f0a2e3c5513/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=61439) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1478.831127] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b01e829-100e-4abc-8601-ca32f816bc68 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1478.838592] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e69c3ca5-5a80-4837-a77a-9435bb6c1014 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1478.869479] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf40340f-2555-4a82-8105-90e2fc2743bf {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1478.876510] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1096b1e-cb4a-4532-a951-222bcd78c2e9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1478.889611] env[61439]: DEBUG nova.compute.provider_tree [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1478.900804] env[61439]: DEBUG nova.scheduler.client.report [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 
'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1478.915578] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.272s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1478.916097] env[61439]: ERROR nova.compute.manager [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1478.916097] env[61439]: Faults: ['InvalidArgument']
[ 1478.916097] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Traceback (most recent call last):
[ 1478.916097] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1478.916097] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] self.driver.spawn(context, instance, image_meta,
[ 1478.916097] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1478.916097] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1478.916097] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1478.916097] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] self._fetch_image_if_missing(context, vi)
[ 1478.916097] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1478.916097] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] image_cache(vi, tmp_image_ds_loc)
[ 1478.916097] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1478.916097] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] vm_util.copy_virtual_disk(
[ 1478.916097] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1478.916097] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] session._wait_for_task(vmdk_copy_task)
[ 1478.916097] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1478.916097] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] return self.wait_for_task(task_ref)
[ 1478.916097] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1478.916097] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] return evt.wait()
[ 1478.916097] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1478.916097] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] result = hub.switch()
[ 1478.916097] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1478.916097] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] return self.greenlet.switch()
[ 1478.916097] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1478.916097] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] self.f(*self.args, **self.kw)
[ 1478.916097] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1478.916097] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] raise exceptions.translate_fault(task_info.error)
[ 1478.916097] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1478.916097] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Faults: ['InvalidArgument']
[ 1478.916097] env[61439]: ERROR nova.compute.manager [instance: 5403acfb-04d2-4081-80c7-23d662410a20]
[ 1478.917423] env[61439]: DEBUG nova.compute.utils [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407
tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] VimFaultException {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1478.918601] env[61439]: DEBUG nova.compute.manager [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Build of instance 5403acfb-04d2-4081-80c7-23d662410a20 was re-scheduled: A specified parameter was not correct: fileType [ 1478.918601] env[61439]: Faults: ['InvalidArgument'] {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1478.918984] env[61439]: DEBUG nova.compute.manager [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1478.919174] env[61439]: DEBUG nova.compute.manager [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1478.919351] env[61439]: DEBUG nova.compute.manager [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1478.919515] env[61439]: DEBUG nova.network.neutron [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1479.227448] env[61439]: DEBUG nova.network.neutron [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1479.247023] env[61439]: INFO nova.compute.manager [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Took 0.33 seconds to deallocate network for instance. 
[ 1479.330043] env[61439]: INFO nova.scheduler.client.report [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Deleted allocations for instance 5403acfb-04d2-4081-80c7-23d662410a20 [ 1479.350498] env[61439]: DEBUG oslo_concurrency.lockutils [None req-ce69b387-7179-4a79-8a3e-67ee1f5a2407 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Lock "5403acfb-04d2-4081-80c7-23d662410a20" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 344.905s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1479.350771] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "5403acfb-04d2-4081-80c7-23d662410a20" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 275.073s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1479.350964] env[61439]: INFO nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] During sync_power_state the instance has a pending task (spawning). Skip. 
[ 1479.351153] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "5403acfb-04d2-4081-80c7-23d662410a20" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1479.351654] env[61439]: DEBUG oslo_concurrency.lockutils [None req-928d894d-cf38-44e9-94bb-9b4493d0e9b0 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Lock "5403acfb-04d2-4081-80c7-23d662410a20" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 149.071s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1479.351889] env[61439]: DEBUG oslo_concurrency.lockutils [None req-928d894d-cf38-44e9-94bb-9b4493d0e9b0 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Acquiring lock "5403acfb-04d2-4081-80c7-23d662410a20-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1479.352120] env[61439]: DEBUG oslo_concurrency.lockutils [None req-928d894d-cf38-44e9-94bb-9b4493d0e9b0 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Lock "5403acfb-04d2-4081-80c7-23d662410a20-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1479.352324] env[61439]: DEBUG oslo_concurrency.lockutils [None req-928d894d-cf38-44e9-94bb-9b4493d0e9b0 tempest-InstanceActionsNegativeTestJSON-667127095 
tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Lock "5403acfb-04d2-4081-80c7-23d662410a20-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1479.354287] env[61439]: INFO nova.compute.manager [None req-928d894d-cf38-44e9-94bb-9b4493d0e9b0 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Terminating instance [ 1479.356036] env[61439]: DEBUG nova.compute.manager [None req-928d894d-cf38-44e9-94bb-9b4493d0e9b0 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1479.356238] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-928d894d-cf38-44e9-94bb-9b4493d0e9b0 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1479.356727] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-1d4548b9-003f-4dbd-9a37-e60fe321948f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1479.366816] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74ab407b-bc5d-4e9b-9b55-e39a08af7e96 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1479.390359] env[61439]: WARNING nova.virt.vmwareapi.vmops [None 
req-928d894d-cf38-44e9-94bb-9b4493d0e9b0 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 5403acfb-04d2-4081-80c7-23d662410a20 could not be found. [ 1479.390583] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-928d894d-cf38-44e9-94bb-9b4493d0e9b0 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1479.390768] env[61439]: INFO nova.compute.manager [None req-928d894d-cf38-44e9-94bb-9b4493d0e9b0 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Took 0.03 seconds to destroy the instance on the hypervisor. [ 1479.391014] env[61439]: DEBUG oslo.service.loopingcall [None req-928d894d-cf38-44e9-94bb-9b4493d0e9b0 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1479.391224] env[61439]: DEBUG nova.compute.manager [-] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1479.391319] env[61439]: DEBUG nova.network.neutron [-] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1479.411531] env[61439]: DEBUG nova.network.neutron [-] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1479.418680] env[61439]: INFO nova.compute.manager [-] [instance: 5403acfb-04d2-4081-80c7-23d662410a20] Took 0.03 seconds to deallocate network for instance. [ 1479.500123] env[61439]: DEBUG oslo_concurrency.lockutils [None req-928d894d-cf38-44e9-94bb-9b4493d0e9b0 tempest-InstanceActionsNegativeTestJSON-667127095 tempest-InstanceActionsNegativeTestJSON-667127095-project-member] Lock "5403acfb-04d2-4081-80c7-23d662410a20" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.148s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1496.202704] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1496.203052] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Cleaning up deleted instances with incomplete migration {{(pid=61439) _cleanup_incomplete_migrations 
/opt/stack/nova/nova/compute/manager.py:11236}} [ 1496.231552] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1500.239534] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager.update_available_resource {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1500.251095] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1500.251319] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1500.251484] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1500.251640] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=61439) update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1500.252717] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aafad2a2-cb5c-4558-917f-c9af082b7073 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.262112] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3f0d2b4-ccdf-40e6-b0bf-5b5b25538a53 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.275869] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65c834df-f447-4a3d-a264-80eb61aa0d91 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.282116] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38b91e49-b407-4a5f-9a84-caf04aef1720 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.311266] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181557MB free_disk=35GB free_vcpus=48 pci_devices=None {{(pid=61439) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1500.311411] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1500.311588] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" 
acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1500.360305] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance ba1bd6f7-6f05-4a09-a35c-6493b64feb9d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1500.360469] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1500.360623] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 4abe5722-e83b-4c40-9b82-ca84545496c8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1500.360826] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Total usable vcpus: 48, total allocated vcpus: 3 {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1500.360971] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=896MB phys_disk=200GB used_disk=3GB total_vcpus=48 used_vcpus=3 pci_stats=[] {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1500.406442] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc05d0c4-f16b-4afb-829b-aa353ac3650e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.413752] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca91b0da-4caf-49e8-bc6e-c0fecfb7bd54 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.442632] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff08f98d-7dd8-4515-a736-ee54df126b64 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.449196] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06fe3d2f-3392-40bd-affa-13cdd73f90a3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1500.462853] env[61439]: DEBUG nova.compute.provider_tree [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not 
changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1500.469713] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1500.481978] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=61439) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1500.482346] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.171s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1501.440807] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1501.440807] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=61439) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1501.440807] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1501.440807] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=61439) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1502.202822] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1503.202053] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1504.201792] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1504.201792] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Starting heal instance info cache {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1504.201964] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Rebuilding the list of instances to heal {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 
1504.213713] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1504.214026] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1504.214026] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1504.214159] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Didn't find any instances for network info cache update. 
{{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 1504.214572] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1504.214750] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1504.214928] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1504.215068] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Cleaning up deleted instances {{(pid=61439) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}}
[ 1504.222731] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] There are 0 instances to clean {{(pid=61439) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}}
[ 1527.537682] env[61439]: WARNING oslo_vmware.rw_handles [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1527.537682] env[61439]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1527.537682] env[61439]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1527.537682] env[61439]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 1527.537682] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1527.537682] env[61439]: ERROR oslo_vmware.rw_handles response.begin()
[ 1527.537682] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1527.537682] env[61439]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 1527.537682] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1527.537682] env[61439]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 1527.537682] env[61439]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1527.537682] env[61439]: ERROR oslo_vmware.rw_handles
[ 1527.538427] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Downloaded image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to vmware_temp/24afee06-a89b-486e-92b9-2f0a2e3c5513/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1527.540191] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Caching image {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1527.540447] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Copying Virtual
Disk [datastore2] vmware_temp/24afee06-a89b-486e-92b9-2f0a2e3c5513/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk to [datastore2] vmware_temp/24afee06-a89b-486e-92b9-2f0a2e3c5513/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk {{(pid=61439) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1527.540728] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-5e56d624-b2ae-435a-b002-11cad8ecf5d1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1527.547900] env[61439]: DEBUG oslo_vmware.api [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for the task: (returnval){ [ 1527.547900] env[61439]: value = "task-987764" [ 1527.547900] env[61439]: _type = "Task" [ 1527.547900] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1527.556378] env[61439]: DEBUG oslo_vmware.api [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': task-987764, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1528.058280] env[61439]: DEBUG oslo_vmware.exceptions [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Fault InvalidArgument not matched. 
{{(pid=61439) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1528.058563] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1528.059125] env[61439]: ERROR nova.compute.manager [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1528.059125] env[61439]: Faults: ['InvalidArgument'] [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Traceback (most recent call last): [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] yield resources [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] self.driver.spawn(context, instance, image_meta, [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: 
ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] self._fetch_image_if_missing(context, vi) [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] image_cache(vi, tmp_image_ds_loc) [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] vm_util.copy_virtual_disk( [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] session._wait_for_task(vmdk_copy_task) [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] return self.wait_for_task(task_ref) [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1528.059125] env[61439]: ERROR nova.compute.manager 
[instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] return evt.wait() [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] result = hub.switch() [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] return self.greenlet.switch() [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] self.f(*self.args, **self.kw) [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] raise exceptions.translate_fault(task_info.error) [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Faults: ['InvalidArgument'] [ 1528.059125] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] [ 1528.060108] env[61439]: INFO nova.compute.manager [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 
tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Terminating instance [ 1528.060965] env[61439]: DEBUG oslo_concurrency.lockutils [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1528.061191] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1528.061428] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a8802dfc-ebeb-4eeb-89b9-1b5db31b9f74 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.063836] env[61439]: DEBUG nova.compute.manager [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1528.064038] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1528.064752] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14205bf5-027e-437b-9334-ba2128713b54 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.072143] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Unregistering the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1528.073139] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0559b1d3-b471-4bc6-8c61-84f4cbb5412e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.074525] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1528.074698] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=61439) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1528.075358] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-62acd836-7ee0-4a8c-b624-920a853c8508 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.080363] env[61439]: DEBUG oslo_vmware.api [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Waiting for the task: (returnval){ [ 1528.080363] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]5286486c-ad14-71d2-31fa-f2dc3d9dcbd6" [ 1528.080363] env[61439]: _type = "Task" [ 1528.080363] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1528.087097] env[61439]: DEBUG oslo_vmware.api [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]5286486c-ad14-71d2-31fa-f2dc3d9dcbd6, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1528.144022] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Unregistered the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1528.144249] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Deleting contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1528.144422] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Deleting the datastore file [datastore2] ba1bd6f7-6f05-4a09-a35c-6493b64feb9d {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1528.144701] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7dd17add-269e-4b99-8d82-7ead33d7704e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.150837] env[61439]: DEBUG oslo_vmware.api [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for the task: (returnval){ [ 1528.150837] env[61439]: value = "task-987766" [ 1528.150837] env[61439]: _type = "Task" [ 1528.150837] env[61439]: } to complete. 
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1528.158296] env[61439]: DEBUG oslo_vmware.api [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': task-987766, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1528.590911] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Preparing fetch location {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1528.591260] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Creating directory with path [datastore2] vmware_temp/32e00254-5ab8-413d-bac8-9b3f3ef11917/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1528.591399] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b3070568-10de-4066-a4fc-50ef78ba9ac7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.601816] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Created directory with path [datastore2] vmware_temp/32e00254-5ab8-413d-bac8-9b3f3ef11917/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1528.601986] env[61439]: DEBUG nova.virt.vmwareapi.vmops 
[None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Fetch image to [datastore2] vmware_temp/32e00254-5ab8-413d-bac8-9b3f3ef11917/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1528.602177] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to [datastore2] vmware_temp/32e00254-5ab8-413d-bac8-9b3f3ef11917/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1528.602940] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-303c047d-adf0-4972-b1a4-b3790185bcb9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.609198] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77f0ffe8-66ee-400a-ad67-be7f395d1b4f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.617917] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-264d1f20-1b14-448e-a86c-586a233a6bdb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.648619] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ac1120d-247e-4515-8b66-7cabc41b51b0 {{(pid=61439) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.656326] env[61439]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ec5345dd-b870-4bb6-a82b-cb29fa79b690 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.660497] env[61439]: DEBUG oslo_vmware.api [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': task-987766, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.071777} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1528.661015] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Deleted the datastore file {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1528.661230] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Deleted contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1528.661428] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1528.661649] env[61439]: INFO nova.compute.manager [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 
ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Took 0.60 seconds to destroy the instance on the hypervisor. [ 1528.663689] env[61439]: DEBUG nova.compute.claims [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1528.663875] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1528.664099] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1528.678771] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1528.728909] env[61439]: DEBUG oslo_vmware.rw_handles [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = 
https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/32e00254-5ab8-413d-bac8-9b3f3ef11917/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1528.788077] env[61439]: DEBUG oslo_vmware.rw_handles [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Completed reading data from the image iterator. {{(pid=61439) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1528.788271] env[61439]: DEBUG oslo_vmware.rw_handles [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/32e00254-5ab8-413d-bac8-9b3f3ef11917/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=61439) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1528.806861] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf02ac7b-0672-40a7-b41b-a4786fea7632 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.813959] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19334348-ec02-41f6-a1d7-5e4073067114 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.843594] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd09713a-a4d3-4522-b8fe-49ae9121db64 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.850020] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea59b94b-2d01-48aa-8cd3-0b8fe2159e38 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.862283] env[61439]: DEBUG nova.compute.provider_tree [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1528.870457] env[61439]: DEBUG nova.scheduler.client.report [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 
'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1528.882747] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.219s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1528.883427] env[61439]: ERROR nova.compute.manager [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1528.883427] env[61439]: Faults: ['InvalidArgument'] [ 1528.883427] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Traceback (most recent call last): [ 1528.883427] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1528.883427] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] self.driver.spawn(context, instance, image_meta, [ 1528.883427] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1528.883427] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1528.883427] env[61439]: ERROR 
nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1528.883427] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] self._fetch_image_if_missing(context, vi) [ 1528.883427] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1528.883427] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] image_cache(vi, tmp_image_ds_loc) [ 1528.883427] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1528.883427] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] vm_util.copy_virtual_disk( [ 1528.883427] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1528.883427] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] session._wait_for_task(vmdk_copy_task) [ 1528.883427] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1528.883427] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] return self.wait_for_task(task_ref) [ 1528.883427] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1528.883427] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] return evt.wait() [ 1528.883427] env[61439]: ERROR nova.compute.manager [instance: 
ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1528.883427] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] result = hub.switch() [ 1528.883427] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1528.883427] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] return self.greenlet.switch() [ 1528.883427] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1528.883427] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] self.f(*self.args, **self.kw) [ 1528.883427] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1528.883427] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] raise exceptions.translate_fault(task_info.error) [ 1528.883427] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1528.883427] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Faults: ['InvalidArgument'] [ 1528.883427] env[61439]: ERROR nova.compute.manager [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] [ 1528.884419] env[61439]: DEBUG nova.compute.utils [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] VimFaultException {{(pid=61439) 
notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1528.885939] env[61439]: DEBUG nova.compute.manager [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Build of instance ba1bd6f7-6f05-4a09-a35c-6493b64feb9d was re-scheduled: A specified parameter was not correct: fileType
[ 1528.885939] env[61439]: Faults: ['InvalidArgument'] {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1528.886328] env[61439]: DEBUG nova.compute.manager [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1528.886522] env[61439]: DEBUG nova.compute.manager [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 1528.886713] env[61439]: DEBUG nova.compute.manager [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1528.886879] env[61439]: DEBUG nova.network.neutron [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1529.159232] env[61439]: DEBUG nova.network.neutron [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1529.172654] env[61439]: INFO nova.compute.manager [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Took 0.29 seconds to deallocate network for instance.
[ 1529.261661] env[61439]: INFO nova.scheduler.client.report [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Deleted allocations for instance ba1bd6f7-6f05-4a09-a35c-6493b64feb9d
[ 1529.281789] env[61439]: DEBUG oslo_concurrency.lockutils [None req-0440fba7-33d5-4895-8b0f-4c0c424b327d tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "ba1bd6f7-6f05-4a09-a35c-6493b64feb9d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 265.021s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1529.282062] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7ef041dc-ea3e-4a67-8b3b-030a1eaa4fc9 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "ba1bd6f7-6f05-4a09-a35c-6493b64feb9d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 69.695s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1529.282292] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7ef041dc-ea3e-4a67-8b3b-030a1eaa4fc9 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "ba1bd6f7-6f05-4a09-a35c-6493b64feb9d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1529.282677] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7ef041dc-ea3e-4a67-8b3b-030a1eaa4fc9 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "ba1bd6f7-6f05-4a09-a35c-6493b64feb9d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1529.282783] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7ef041dc-ea3e-4a67-8b3b-030a1eaa4fc9 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "ba1bd6f7-6f05-4a09-a35c-6493b64feb9d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1529.284584] env[61439]: INFO nova.compute.manager [None req-7ef041dc-ea3e-4a67-8b3b-030a1eaa4fc9 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Terminating instance
[ 1529.286279] env[61439]: DEBUG nova.compute.manager [None req-7ef041dc-ea3e-4a67-8b3b-030a1eaa4fc9 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Start destroying the instance on the hypervisor.
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1529.286472] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-7ef041dc-ea3e-4a67-8b3b-030a1eaa4fc9 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1529.286938] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-87fac48c-80e0-4a1a-ac77-9996e9693007 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1529.298036] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab3dff7c-1f6a-405d-882a-d61569f32edb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1529.320263] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-7ef041dc-ea3e-4a67-8b3b-030a1eaa4fc9 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ba1bd6f7-6f05-4a09-a35c-6493b64feb9d could not be found.
[ 1529.320470] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-7ef041dc-ea3e-4a67-8b3b-030a1eaa4fc9 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1529.320685] env[61439]: INFO nova.compute.manager [None req-7ef041dc-ea3e-4a67-8b3b-030a1eaa4fc9 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Took 0.03 seconds to destroy the instance on the hypervisor.
[ 1529.320937] env[61439]: DEBUG oslo.service.loopingcall [None req-7ef041dc-ea3e-4a67-8b3b-030a1eaa4fc9 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1529.321161] env[61439]: DEBUG nova.compute.manager [-] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1529.321259] env[61439]: DEBUG nova.network.neutron [-] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1529.344165] env[61439]: DEBUG nova.network.neutron [-] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1529.352061] env[61439]: INFO nova.compute.manager [-] [instance: ba1bd6f7-6f05-4a09-a35c-6493b64feb9d] Took 0.03 seconds to deallocate network for instance.
[ 1529.434109] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7ef041dc-ea3e-4a67-8b3b-030a1eaa4fc9 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "ba1bd6f7-6f05-4a09-a35c-6493b64feb9d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.152s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1530.641617] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "7c22930b-a1b7-46b7-82ae-de363f85b393" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1530.641896] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "7c22930b-a1b7-46b7-82ae-de363f85b393" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1530.651710] env[61439]: DEBUG nova.compute.manager [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1530.701531] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1530.701824] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1530.703781] env[61439]: INFO nova.compute.claims [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1530.799339] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96e9bc1a-0863-4bdd-a559-78a41e0c0fb9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1530.806867] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3a832f6-139e-4f60-be9b-592758802fe1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1530.836933] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa493e9b-e285-4bd4-a8dc-79d61996ee85 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1530.843384] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f65527c-704f-471f-bb99-c822b9c65425 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1530.855821] env[61439]: DEBUG nova.compute.provider_tree [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1530.864124] env[61439]: DEBUG nova.scheduler.client.report [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1530.875935] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.174s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1530.876393] env[61439]: DEBUG nova.compute.manager [None req-8393179c-baf6-49ae-bb58-2638f5142f9a
tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 1530.906165] env[61439]: DEBUG nova.compute.utils [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1530.907251] env[61439]: DEBUG nova.compute.manager [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 1530.907420] env[61439]: DEBUG nova.network.neutron [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1530.915312] env[61439]: DEBUG nova.compute.manager [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 1530.969718] env[61439]: DEBUG nova.policy [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cf4545757716483485ca9b60bd689a1c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e839303682f748a4b5a42c8a9273e388', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1530.972535] env[61439]: DEBUG nova.compute.manager [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Start spawning the instance on the hypervisor.
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 1530.996668] env[61439]: DEBUG nova.virt.hardware [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1530.996936] env[61439]: DEBUG nova.virt.hardware [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1530.997109] env[61439]: DEBUG nova.virt.hardware [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1530.997294] env[61439]: DEBUG nova.virt.hardware [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1530.997440] env[61439]: DEBUG nova.virt.hardware [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1530.997586] env[61439]: DEBUG nova.virt.hardware [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1530.997797] env[61439]: DEBUG nova.virt.hardware [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1530.997957] env[61439]: DEBUG nova.virt.hardware [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1530.998139] env[61439]: DEBUG nova.virt.hardware [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1530.998303] env[61439]: DEBUG nova.virt.hardware [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1530.998473] env[61439]: DEBUG nova.virt.hardware [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1530.999316] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4ea5bdc-8bb8-4d8d-bd6e-f7323d7c37cb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1531.007128] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ee21e7c-cf1b-4109-a29c-da8688268760 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1531.262986] env[61439]: DEBUG nova.network.neutron [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Successfully created port: 730ceafb-2bdb-42b0-8223-e1be193f7efd {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1531.955864] env[61439]: DEBUG nova.compute.manager [req-07a709f4-fd8d-44f2-9ae7-ef27317cf22a req-81aa9101-e066-4043-b162-c6a589884901 service nova] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Received event network-vif-plugged-730ceafb-2bdb-42b0-8223-e1be193f7efd {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1531.955864] env[61439]: DEBUG oslo_concurrency.lockutils [req-07a709f4-fd8d-44f2-9ae7-ef27317cf22a req-81aa9101-e066-4043-b162-c6a589884901 service nova] Acquiring lock "7c22930b-a1b7-46b7-82ae-de363f85b393-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=61439) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1531.955864] env[61439]: DEBUG oslo_concurrency.lockutils [req-07a709f4-fd8d-44f2-9ae7-ef27317cf22a req-81aa9101-e066-4043-b162-c6a589884901 service nova] Lock "7c22930b-a1b7-46b7-82ae-de363f85b393-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1531.955864] env[61439]: DEBUG oslo_concurrency.lockutils [req-07a709f4-fd8d-44f2-9ae7-ef27317cf22a req-81aa9101-e066-4043-b162-c6a589884901 service nova] Lock "7c22930b-a1b7-46b7-82ae-de363f85b393-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1531.955864] env[61439]: DEBUG nova.compute.manager [req-07a709f4-fd8d-44f2-9ae7-ef27317cf22a req-81aa9101-e066-4043-b162-c6a589884901 service nova] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] No waiting events found dispatching network-vif-plugged-730ceafb-2bdb-42b0-8223-e1be193f7efd {{(pid=61439) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1531.955864] env[61439]: WARNING nova.compute.manager [req-07a709f4-fd8d-44f2-9ae7-ef27317cf22a req-81aa9101-e066-4043-b162-c6a589884901 service nova] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Received unexpected event network-vif-plugged-730ceafb-2bdb-42b0-8223-e1be193f7efd for instance with vm_state building and task_state spawning.
[ 1532.036198] env[61439]: DEBUG nova.network.neutron [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Successfully updated port: 730ceafb-2bdb-42b0-8223-e1be193f7efd {{(pid=61439) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1532.049824] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "refresh_cache-7c22930b-a1b7-46b7-82ae-de363f85b393" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1532.051445] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquired lock "refresh_cache-7c22930b-a1b7-46b7-82ae-de363f85b393" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1532.051445] env[61439]: DEBUG nova.network.neutron [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1532.088449] env[61439]: DEBUG nova.network.neutron [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1532.240361] env[61439]: DEBUG nova.network.neutron [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Updating instance_info_cache with network_info: [{"id": "730ceafb-2bdb-42b0-8223-e1be193f7efd", "address": "fa:16:3e:55:78:dc", "network": {"id": "ae0ba33d-286b-46d2-b7e5-caea99f81aea", "bridge": "br-int", "label": "tempest-ServersTestJSON-1142892958-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e839303682f748a4b5a42c8a9273e388", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "779b8e65-8b9e-427e-af08-910febd65bfa", "external-id": "nsx-vlan-transportzone-906", "segmentation_id": 906, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap730ceafb-2b", "ovs_interfaceid": "730ceafb-2bdb-42b0-8223-e1be193f7efd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1532.250881] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Releasing lock "refresh_cache-7c22930b-a1b7-46b7-82ae-de363f85b393" {{(pid=61439) lock
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1532.251163] env[61439]: DEBUG nova.compute.manager [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Instance network_info: |[{"id": "730ceafb-2bdb-42b0-8223-e1be193f7efd", "address": "fa:16:3e:55:78:dc", "network": {"id": "ae0ba33d-286b-46d2-b7e5-caea99f81aea", "bridge": "br-int", "label": "tempest-ServersTestJSON-1142892958-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e839303682f748a4b5a42c8a9273e388", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "779b8e65-8b9e-427e-af08-910febd65bfa", "external-id": "nsx-vlan-transportzone-906", "segmentation_id": 906, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap730ceafb-2b", "ovs_interfaceid": "730ceafb-2bdb-42b0-8223-e1be193f7efd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 1532.251535] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:55:78:dc', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '779b8e65-8b9e-427e-af08-910febd65bfa', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '730ceafb-2bdb-42b0-8223-e1be193f7efd', 'vif_model': 'vmxnet3'}] {{(pid=61439) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1532.259068] env[61439]: DEBUG oslo.service.loopingcall [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1532.259482] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Creating VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1532.259699] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-739893f5-d626-468f-8d8d-03c54784d8ba {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1532.280814] env[61439]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1532.280814] env[61439]: value = "task-987767"
[ 1532.280814] env[61439]: _type = "Task"
[ 1532.280814] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1532.290591] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987767, 'name': CreateVM_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1532.790996] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987767, 'name': CreateVM_Task, 'duration_secs': 0.287194} completed successfully.
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1532.791200] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Created VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1532.798381] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1532.798554] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1532.798871] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1532.799129] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ec2fba50-1827-4ff7-bc4e-31727c773b30 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1532.803300] env[61439]: DEBUG oslo_vmware.api [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for the task: (returnval){ [ 
1532.803300] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]5284206a-f11b-59a5-925e-d47462386a27" [ 1532.803300] env[61439]: _type = "Task" [ 1532.803300] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1532.810574] env[61439]: DEBUG oslo_vmware.api [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]5284206a-f11b-59a5-925e-d47462386a27, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1533.314544] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1533.314813] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Processing image a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1533.314996] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1533.980652] env[61439]: DEBUG nova.compute.manager 
[req-6756eb6c-7956-4560-a810-b9a855128501 req-d540ac17-9c7c-4124-afde-dce0be774892 service nova] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Received event network-changed-730ceafb-2bdb-42b0-8223-e1be193f7efd {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1533.980847] env[61439]: DEBUG nova.compute.manager [req-6756eb6c-7956-4560-a810-b9a855128501 req-d540ac17-9c7c-4124-afde-dce0be774892 service nova] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Refreshing instance network info cache due to event network-changed-730ceafb-2bdb-42b0-8223-e1be193f7efd. {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1533.981079] env[61439]: DEBUG oslo_concurrency.lockutils [req-6756eb6c-7956-4560-a810-b9a855128501 req-d540ac17-9c7c-4124-afde-dce0be774892 service nova] Acquiring lock "refresh_cache-7c22930b-a1b7-46b7-82ae-de363f85b393" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1533.981225] env[61439]: DEBUG oslo_concurrency.lockutils [req-6756eb6c-7956-4560-a810-b9a855128501 req-d540ac17-9c7c-4124-afde-dce0be774892 service nova] Acquired lock "refresh_cache-7c22930b-a1b7-46b7-82ae-de363f85b393" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1533.981386] env[61439]: DEBUG nova.network.neutron [req-6756eb6c-7956-4560-a810-b9a855128501 req-d540ac17-9c7c-4124-afde-dce0be774892 service nova] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Refreshing network info cache for port 730ceafb-2bdb-42b0-8223-e1be193f7efd {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1534.297561] env[61439]: DEBUG nova.network.neutron [req-6756eb6c-7956-4560-a810-b9a855128501 req-d540ac17-9c7c-4124-afde-dce0be774892 service nova] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Updated VIF entry in instance network info cache for port 
730ceafb-2bdb-42b0-8223-e1be193f7efd. {{(pid=61439) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1534.297927] env[61439]: DEBUG nova.network.neutron [req-6756eb6c-7956-4560-a810-b9a855128501 req-d540ac17-9c7c-4124-afde-dce0be774892 service nova] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Updating instance_info_cache with network_info: [{"id": "730ceafb-2bdb-42b0-8223-e1be193f7efd", "address": "fa:16:3e:55:78:dc", "network": {"id": "ae0ba33d-286b-46d2-b7e5-caea99f81aea", "bridge": "br-int", "label": "tempest-ServersTestJSON-1142892958-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e839303682f748a4b5a42c8a9273e388", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "779b8e65-8b9e-427e-af08-910febd65bfa", "external-id": "nsx-vlan-transportzone-906", "segmentation_id": 906, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap730ceafb-2b", "ovs_interfaceid": "730ceafb-2bdb-42b0-8223-e1be193f7efd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1534.306685] env[61439]: DEBUG oslo_concurrency.lockutils [req-6756eb6c-7956-4560-a810-b9a855128501 req-d540ac17-9c7c-4124-afde-dce0be774892 service nova] Releasing lock "refresh_cache-7c22930b-a1b7-46b7-82ae-de363f85b393" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1560.211053] 
env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager.update_available_resource {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1560.222387] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1560.222612] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1560.222791] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1560.222956] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=61439) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1560.224134] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93288d8f-dfc5-4cf8-a152-e7d34c7e12d8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.232748] env[61439]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-258ab2e6-3a5c-4d30-9efe-aec04a55197d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.247634] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-401385f4-4e7d-478c-a257-e9d9fccc1f34 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.253729] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9ac62ee-a312-4edd-abbe-04df855c7462 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.282032] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181579MB free_disk=35GB free_vcpus=48 pci_devices=None {{(pid=61439) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1560.282189] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1560.282379] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1560.368330] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a actively managed 
on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1560.368506] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 4abe5722-e83b-4c40-9b82-ca84545496c8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1560.368638] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 7c22930b-a1b7-46b7-82ae-de363f85b393 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1560.368832] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Total usable vcpus: 48, total allocated vcpus: 3 {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1560.368975] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=896MB phys_disk=200GB used_disk=3GB total_vcpus=48 used_vcpus=3 pci_stats=[] {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1560.384542] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Refreshing inventories for resource provider b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 
1560.397251] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Updating ProviderTree inventory for provider b35c9fce-988b-4acc-b175-83b202107c41 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1560.397429] env[61439]: DEBUG nova.compute.provider_tree [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Updating inventory in ProviderTree for provider b35c9fce-988b-4acc-b175-83b202107c41 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1560.407294] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Refreshing aggregate associations for resource provider b35c9fce-988b-4acc-b175-83b202107c41, aggregates: None {{(pid=61439) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1560.424788] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Refreshing trait associations for resource provider b35c9fce-988b-4acc-b175-83b202107c41, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NODE 
{{(pid=61439) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1560.466270] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca72947c-c328-412e-9e6c-848fb2da13a4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.473742] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fb5f8f6-7b12-455a-8199-2bab85d18840 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.503232] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51282318-8e34-4b1d-a619-cacffaf55d7a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.509587] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a7a4d9e-35b9-4ae0-a181-1eee86570c05 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.521965] env[61439]: DEBUG nova.compute.provider_tree [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1560.531570] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 
35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1560.544571] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=61439) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1560.544755] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.262s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1562.536313] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1563.197382] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1563.201033] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1563.201240] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 
1563.201391] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=61439) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1564.203127] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1565.202217] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1565.202466] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Starting heal instance info cache {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1565.202542] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Rebuilding the list of instances to heal {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1565.215453] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1565.215826] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Skipping network cache update for instance because it is Building. 
{{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1565.215826] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1565.215934] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Didn't find any instances for network info cache update. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1565.216335] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1565.216519] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1567.211276] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1577.805901] env[61439]: WARNING oslo_vmware.rw_handles [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1577.805901] env[61439]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 
1577.805901] env[61439]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1577.805901] env[61439]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1577.805901] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1577.805901] env[61439]: ERROR oslo_vmware.rw_handles response.begin() [ 1577.805901] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1577.805901] env[61439]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1577.805901] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1577.805901] env[61439]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1577.805901] env[61439]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1577.805901] env[61439]: ERROR oslo_vmware.rw_handles [ 1577.806786] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Downloaded image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to vmware_temp/32e00254-5ab8-413d-bac8-9b3f3ef11917/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1577.808541] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Caching image {{(pid=61439) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1577.808784] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Copying Virtual Disk [datastore2] vmware_temp/32e00254-5ab8-413d-bac8-9b3f3ef11917/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk to [datastore2] vmware_temp/32e00254-5ab8-413d-bac8-9b3f3ef11917/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk {{(pid=61439) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1577.809097] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-2be9f6c2-54d4-435f-8287-529c838589e2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1577.818080] env[61439]: DEBUG oslo_vmware.api [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Waiting for the task: (returnval){ [ 1577.818080] env[61439]: value = "task-987768" [ 1577.818080] env[61439]: _type = "Task" [ 1577.818080] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1577.826130] env[61439]: DEBUG oslo_vmware.api [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Task: {'id': task-987768, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1578.328331] env[61439]: DEBUG oslo_vmware.exceptions [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Fault InvalidArgument not matched. {{(pid=61439) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1578.328626] env[61439]: DEBUG oslo_concurrency.lockutils [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1578.329205] env[61439]: ERROR nova.compute.manager [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1578.329205] env[61439]: Faults: ['InvalidArgument'] [ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Traceback (most recent call last): [ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] yield resources [ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1578.329205] 
env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] self.driver.spawn(context, instance, image_meta,
[ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] self._fetch_image_if_missing(context, vi)
[ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] image_cache(vi, tmp_image_ds_loc)
[ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] vm_util.copy_virtual_disk(
[ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] session._wait_for_task(vmdk_copy_task)
[ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] return self.wait_for_task(task_ref)
[ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] return evt.wait()
[ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] result = hub.switch()
[ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] return self.greenlet.switch()
[ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] self.f(*self.args, **self.kw)
[ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] raise exceptions.translate_fault(task_info.error)
[ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Faults: ['InvalidArgument']
[ 1578.329205] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a]
[ 1578.330384] env[61439]: INFO nova.compute.manager [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Terminating instance
[ 1578.331085] env[61439]: DEBUG oslo_concurrency.lockutils [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1578.331294] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1578.331526] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-63e497e3-d0c8-4691-a0ab-9bc543323ae8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1578.333687] env[61439]: DEBUG nova.compute.manager [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1578.333887] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1578.334598] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97434c9f-dea0-4abb-849e-9aaf7aa2a0a4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1578.341351] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Unregistering the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1578.341556] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-67155d4e-d2bf-4bc7-820e-2d95de028656 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1578.344024] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1578.344133] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=61439) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1578.345027] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8fd1800f-04ab-4f3e-9c1c-fd9620c7cfa6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1578.349924] env[61439]: DEBUG oslo_vmware.api [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Waiting for the task: (returnval){
[ 1578.349924] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52f9f22f-264b-2494-1d0f-32b1b2a02017"
[ 1578.349924] env[61439]: _type = "Task"
[ 1578.349924] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1578.357577] env[61439]: DEBUG oslo_vmware.api [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52f9f22f-264b-2494-1d0f-32b1b2a02017, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1578.409596] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Unregistered the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1578.409822] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Deleting contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1578.409985] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Deleting the datastore file [datastore2] 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1578.410266] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-18cb23bf-8624-4b03-9a60-9acac9f2a6c5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1578.416385] env[61439]: DEBUG oslo_vmware.api [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Waiting for the task: (returnval){
[ 1578.416385] env[61439]: value = "task-987770"
[ 1578.416385] env[61439]: _type = "Task"
[ 1578.416385] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1578.423722] env[61439]: DEBUG oslo_vmware.api [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Task: {'id': task-987770, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1578.860485] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Preparing fetch location {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1578.860911] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Creating directory with path [datastore2] vmware_temp/6d4cca35-c0b9-478f-a7d7-0c73ce2cd24b/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1578.860960] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-88f45018-9480-4621-9114-e9f9037fb35f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1578.873055] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Created directory with path [datastore2] vmware_temp/6d4cca35-c0b9-478f-a7d7-0c73ce2cd24b/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1578.873236] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Fetch image to [datastore2] vmware_temp/6d4cca35-c0b9-478f-a7d7-0c73ce2cd24b/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1578.873404] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to [datastore2] vmware_temp/6d4cca35-c0b9-478f-a7d7-0c73ce2cd24b/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1578.874081] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4323af36-57aa-48eb-ae70-80e4067ae0fb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1578.880294] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d2052a0-8822-4bc5-ab91-5013e0b75f66 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1578.889102] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57c3a057-d697-493a-9c97-797960e1f865 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1578.921391] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4110e651-1740-45f7-bd13-678118a3f514 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1578.929568] env[61439]: DEBUG oslo_vmware.api [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Task: {'id': task-987770, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074365} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1578.929774] env[61439]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-77837b0b-3379-45ff-8638-59f97d41da78 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1578.931365] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Deleted the datastore file {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1578.931545] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Deleted contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1578.931716] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1578.931891] env[61439]: INFO nova.compute.manager [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Took 0.60 seconds to destroy the instance on the hypervisor.
[ 1578.933966] env[61439]: DEBUG nova.compute.claims [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1578.934154] env[61439]: DEBUG oslo_concurrency.lockutils [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1578.934372] env[61439]: DEBUG oslo_concurrency.lockutils [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1578.954041] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1579.032916] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a66cffc-34e3-4fba-8515-5b19b6922adc {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1579.040188] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4cb996c-3e96-4c8b-856c-f1793310c39d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1579.077023] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aaa6f03f-b2c9-44be-bef4-f073580ce3a4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1579.084217] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35a17b31-bd81-40bc-b187-4015f7a6a3b2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1579.098140] env[61439]: DEBUG nova.compute.provider_tree [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1579.107494] env[61439]: DEBUG nova.scheduler.client.report [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1579.110800] env[61439]: DEBUG oslo_vmware.rw_handles [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6d4cca35-c0b9-478f-a7d7-0c73ce2cd24b/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1579.166872] env[61439]: DEBUG oslo_concurrency.lockutils [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.232s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1579.167419] env[61439]: ERROR nova.compute.manager [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1579.167419] env[61439]: Faults: ['InvalidArgument']
[ 1579.167419] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Traceback (most recent call last):
[ 1579.167419] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1579.167419] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] self.driver.spawn(context, instance, image_meta,
[ 1579.167419] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1579.167419] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1579.167419] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1579.167419] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] self._fetch_image_if_missing(context, vi)
[ 1579.167419] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1579.167419] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] image_cache(vi, tmp_image_ds_loc)
[ 1579.167419] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1579.167419] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] vm_util.copy_virtual_disk(
[ 1579.167419] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1579.167419] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] session._wait_for_task(vmdk_copy_task)
[ 1579.167419] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1579.167419] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] return self.wait_for_task(task_ref)
[ 1579.167419] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1579.167419] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] return evt.wait()
[ 1579.167419] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1579.167419] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] result = hub.switch()
[ 1579.167419] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1579.167419] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] return self.greenlet.switch()
[ 1579.167419] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1579.167419] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] self.f(*self.args, **self.kw)
[ 1579.167419] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1579.167419] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] raise exceptions.translate_fault(task_info.error)
[ 1579.167419] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1579.167419] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Faults: ['InvalidArgument']
[ 1579.167419] env[61439]: ERROR nova.compute.manager [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a]
[ 1579.168217] env[61439]: DEBUG nova.compute.utils [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] VimFaultException {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1579.170490] env[61439]: DEBUG nova.compute.manager [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Build of instance 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a was re-scheduled: A specified parameter was not correct: fileType
[ 1579.170490] env[61439]: Faults: ['InvalidArgument'] {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1579.170925] env[61439]: DEBUG nova.compute.manager [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1579.171134] env[61439]: DEBUG nova.compute.manager [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 1579.171315] env[61439]: DEBUG nova.compute.manager [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1579.171478] env[61439]: DEBUG nova.network.neutron [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1579.173480] env[61439]: DEBUG oslo_vmware.rw_handles [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Completed reading data from the image iterator. {{(pid=61439) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1579.173667] env[61439]: DEBUG oslo_vmware.rw_handles [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6d4cca35-c0b9-478f-a7d7-0c73ce2cd24b/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1579.432024] env[61439]: DEBUG nova.network.neutron [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1579.442688] env[61439]: INFO nova.compute.manager [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a] Took 0.27 seconds to deallocate network for instance.
[ 1579.529512] env[61439]: INFO nova.scheduler.client.report [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Deleted allocations for instance 5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a
[ 1579.549307] env[61439]: DEBUG oslo_concurrency.lockutils [None req-96438c0a-d283-4012-8477-59279c736c1f tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Lock "5fb1f6f5-7c20-4b89-ae2f-0ab2a416b53a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 313.949s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1621.201815] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager.update_available_resource {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1621.213082] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1621.213307] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1621.213478] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1621.213638] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=61439) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 1621.215186] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82cc2f52-11b9-435f-b28a-a575a8432cd9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1621.223595] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aef04d81-5ebe-4676-95d5-083044342b4a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1621.237178] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c23ab06-c50f-45f5-ad66-534106363d66 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1621.243319] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b72ed90f-10ac-4ed2-a0e1-e595e0c17fdd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1621.272561] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181556MB free_disk=35GB free_vcpus=48 pci_devices=None {{(pid=61439) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 1621.272730] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1621.272910] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1621.314527] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 4abe5722-e83b-4c40-9b82-ca84545496c8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1621.314527] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 7c22930b-a1b7-46b7-82ae-de363f85b393 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1621.314527] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Total usable vcpus: 48, total allocated vcpus: 2 {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 1621.314527] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=768MB phys_disk=200GB used_disk=2GB total_vcpus=48 used_vcpus=2 pci_stats=[] {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 1621.352223] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69e40a00-33f6-4f35-a038-fbbf528d44be {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1621.359542] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ee784fa-306c-4635-abd5-ad458fe5acd9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1621.388564] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6afa9f5c-141e-458e-90e3-93c769a3f66f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[
1621.395424] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0a8955c-be76-48d1-9caa-12274b50f3f1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1621.409520] env[61439]: DEBUG nova.compute.provider_tree [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1621.418294] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1621.430760] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=61439) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1621.430944] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.158s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1623.431538] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b 
None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1623.431826] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1623.431967] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=61439) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1624.202916] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1625.198260] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1625.200893] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1625.201111] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1626.202435] env[61439]: DEBUG oslo_service.periodic_task [None 
req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1627.203324] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1627.203696] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Starting heal instance info cache {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1627.203696] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Rebuilding the list of instances to heal {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1627.214873] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1627.215037] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1627.215187] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Didn't find any instances for network info cache update. 
{{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1627.989538] env[61439]: WARNING oslo_vmware.rw_handles [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1627.989538] env[61439]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1627.989538] env[61439]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1627.989538] env[61439]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1627.989538] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1627.989538] env[61439]: ERROR oslo_vmware.rw_handles response.begin() [ 1627.989538] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1627.989538] env[61439]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1627.989538] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1627.989538] env[61439]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1627.989538] env[61439]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1627.989538] env[61439]: ERROR oslo_vmware.rw_handles [ 1627.990037] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Downloaded image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to 
vmware_temp/6d4cca35-c0b9-478f-a7d7-0c73ce2cd24b/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1627.992179] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Caching image {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1627.992438] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Copying Virtual Disk [datastore2] vmware_temp/6d4cca35-c0b9-478f-a7d7-0c73ce2cd24b/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk to [datastore2] vmware_temp/6d4cca35-c0b9-478f-a7d7-0c73ce2cd24b/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk {{(pid=61439) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1627.992780] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-18660f9b-cf93-485c-8ddd-a7205f4e82a9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1628.003221] env[61439]: DEBUG oslo_vmware.api [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Waiting for the task: (returnval){ [ 1628.003221] env[61439]: value = "task-987771" [ 1628.003221] env[61439]: _type = "Task" [ 1628.003221] env[61439]: } to complete. 
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1628.010798] env[61439]: DEBUG oslo_vmware.api [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Task: {'id': task-987771, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1628.515313] env[61439]: DEBUG oslo_vmware.exceptions [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Fault InvalidArgument not matched. {{(pid=61439) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1628.515666] env[61439]: DEBUG oslo_concurrency.lockutils [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1628.516074] env[61439]: ERROR nova.compute.manager [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1628.516074] env[61439]: Faults: ['InvalidArgument'] [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Traceback (most recent call last): [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] File 
"/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] yield resources [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] self.driver.spawn(context, instance, image_meta, [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] self._fetch_image_if_missing(context, vi) [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] image_cache(vi, tmp_image_ds_loc) [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] vm_util.copy_virtual_disk( [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] session._wait_for_task(vmdk_copy_task) [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] return self.wait_for_task(task_ref) [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] return evt.wait() [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] result = hub.switch() [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] return self.greenlet.switch() [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] self.f(*self.args, **self.kw) [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] raise exceptions.translate_fault(task_info.error) [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Faults: ['InvalidArgument'] [ 1628.516074] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] [ 1628.517100] env[61439]: INFO nova.compute.manager [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Terminating instance [ 1628.517853] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1628.518082] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1628.518316] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ff8d2b60-40db-4f3e-aff1-ea06ab126876 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1628.520395] 
env[61439]: DEBUG nova.compute.manager [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1628.520583] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1628.521280] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b557c616-9afd-44cb-bd30-5b21bab2800c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1628.527905] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Unregistering the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1628.528789] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6ad74606-30e3-4d9a-bac9-1514986fa703 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1628.530088] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1628.530263] 
env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=61439) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1628.530911] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1e570a06-5dfe-4e86-b8ba-f21e9725c471 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1628.535591] env[61439]: DEBUG oslo_vmware.api [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for the task: (returnval){ [ 1628.535591] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52abd950-6a54-50df-d4ab-4db92b87fed0" [ 1628.535591] env[61439]: _type = "Task" [ 1628.535591] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1628.547425] env[61439]: DEBUG oslo_vmware.api [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52abd950-6a54-50df-d4ab-4db92b87fed0, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1628.606242] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Unregistered the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1628.606475] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Deleting contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1628.606650] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Deleting the datastore file [datastore2] 4abe5722-e83b-4c40-9b82-ca84545496c8 {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1628.606910] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2ccd0a38-6a63-479a-ab66-896e0f2e3e5b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1628.613794] env[61439]: DEBUG oslo_vmware.api [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Waiting for the task: (returnval){ [ 1628.613794] env[61439]: value = "task-987773" [ 1628.613794] env[61439]: _type = "Task" [ 1628.613794] env[61439]: } to complete. 
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1628.621120] env[61439]: DEBUG oslo_vmware.api [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Task: {'id': task-987773, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1629.046258] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Preparing fetch location {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1629.046579] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Creating directory with path [datastore2] vmware_temp/43d871d7-52ee-4f2b-a431-694128a3b815/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1629.046747] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-091ce863-0aad-4ba5-bd56-90c371d4468b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1629.057654] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Created directory with path [datastore2] vmware_temp/43d871d7-52ee-4f2b-a431-694128a3b815/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1629.057836] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8393179c-baf6-49ae-bb58-2638f5142f9a 
tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Fetch image to [datastore2] vmware_temp/43d871d7-52ee-4f2b-a431-694128a3b815/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1629.058017] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to [datastore2] vmware_temp/43d871d7-52ee-4f2b-a431-694128a3b815/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1629.058751] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c61044d-217f-465f-9900-b168901a4e97 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1629.065020] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01d79c7f-1c0d-489f-a041-bdfbb31e5f5f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1629.073791] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1e87624-442d-4854-b1be-b464f506ff9a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1629.104643] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b43319ec-c822-4e87-9008-92d82f087f32 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1629.109766] 
env[61439]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-54b8eb75-f591-48e3-b0bc-5482d7b43758 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1629.121546] env[61439]: DEBUG oslo_vmware.api [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Task: {'id': task-987773, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067921} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1629.121767] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Deleted the datastore file {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1629.121949] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Deleted contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1629.122135] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1629.122311] env[61439]: INFO nova.compute.manager [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 
4abe5722-e83b-4c40-9b82-ca84545496c8] Took 0.60 seconds to destroy the instance on the hypervisor. [ 1629.124341] env[61439]: DEBUG nova.compute.claims [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1629.124510] env[61439]: DEBUG oslo_concurrency.lockutils [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1629.124726] env[61439]: DEBUG oslo_concurrency.lockutils [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1629.129886] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1629.186628] env[61439]: DEBUG oslo_vmware.rw_handles [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = 
https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/43d871d7-52ee-4f2b-a431-694128a3b815/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1629.245599] env[61439]: DEBUG oslo_vmware.rw_handles [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Completed reading data from the image iterator. {{(pid=61439) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1629.245899] env[61439]: DEBUG oslo_vmware.rw_handles [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/43d871d7-52ee-4f2b-a431-694128a3b815/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=61439) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1629.253398] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69fc1144-82b3-41b3-b131-0b18472a567f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1629.261181] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4439c752-36a2-4e07-a904-61a59fb8604a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1629.290764] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe5216f3-bc65-4c25-98d4-bf9052545dba {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1629.297567] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9450c1f2-2d00-4f5c-b353-d999a50376bb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1629.310191] env[61439]: DEBUG nova.compute.provider_tree [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1629.317938] env[61439]: DEBUG nova.scheduler.client.report [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 
'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1629.331050] env[61439]: DEBUG oslo_concurrency.lockutils [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.206s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1629.331394] env[61439]: ERROR nova.compute.manager [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1629.331394] env[61439]: Faults: ['InvalidArgument'] [ 1629.331394] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Traceback (most recent call last): [ 1629.331394] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1629.331394] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] self.driver.spawn(context, instance, image_meta, [ 1629.331394] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1629.331394] env[61439]: ERROR nova.compute.manager [instance: 
4abe5722-e83b-4c40-9b82-ca84545496c8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1629.331394] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1629.331394] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] self._fetch_image_if_missing(context, vi) [ 1629.331394] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1629.331394] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] image_cache(vi, tmp_image_ds_loc) [ 1629.331394] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1629.331394] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] vm_util.copy_virtual_disk( [ 1629.331394] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1629.331394] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] session._wait_for_task(vmdk_copy_task) [ 1629.331394] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1629.331394] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] return self.wait_for_task(task_ref) [ 1629.331394] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1629.331394] env[61439]: ERROR nova.compute.manager 
[instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] return evt.wait() [ 1629.331394] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1629.331394] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] result = hub.switch() [ 1629.331394] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1629.331394] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] return self.greenlet.switch() [ 1629.331394] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1629.331394] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] self.f(*self.args, **self.kw) [ 1629.331394] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1629.331394] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] raise exceptions.translate_fault(task_info.error) [ 1629.331394] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1629.331394] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Faults: ['InvalidArgument'] [ 1629.331394] env[61439]: ERROR nova.compute.manager [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] [ 1629.332820] env[61439]: DEBUG nova.compute.utils [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf 
tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] VimFaultException {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1629.333586] env[61439]: DEBUG nova.compute.manager [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Build of instance 4abe5722-e83b-4c40-9b82-ca84545496c8 was re-scheduled: A specified parameter was not correct: fileType [ 1629.333586] env[61439]: Faults: ['InvalidArgument'] {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1629.334116] env[61439]: DEBUG nova.compute.manager [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1629.334402] env[61439]: DEBUG nova.compute.manager [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1629.334687] env[61439]: DEBUG nova.compute.manager [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1629.334959] env[61439]: DEBUG nova.network.neutron [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1629.650679] env[61439]: DEBUG nova.network.neutron [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1629.659835] env[61439]: INFO nova.compute.manager [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Took 0.32 seconds to deallocate network for instance. 
[ 1629.751762] env[61439]: INFO nova.scheduler.client.report [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Deleted allocations for instance 4abe5722-e83b-4c40-9b82-ca84545496c8 [ 1629.771837] env[61439]: DEBUG oslo_concurrency.lockutils [None req-eec01ff8-31b7-4d0a-a7d3-549c6cf24ebf tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Lock "4abe5722-e83b-4c40-9b82-ca84545496c8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 363.612s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1629.771978] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cb4e165-8200-426e-abac-27cf6afc7014 tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Lock "4abe5722-e83b-4c40-9b82-ca84545496c8" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 168.039s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1629.772101] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cb4e165-8200-426e-abac-27cf6afc7014 tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Acquiring lock "4abe5722-e83b-4c40-9b82-ca84545496c8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1629.772315] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cb4e165-8200-426e-abac-27cf6afc7014 tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Lock 
"4abe5722-e83b-4c40-9b82-ca84545496c8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1629.772483] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cb4e165-8200-426e-abac-27cf6afc7014 tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Lock "4abe5722-e83b-4c40-9b82-ca84545496c8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1629.774458] env[61439]: INFO nova.compute.manager [None req-2cb4e165-8200-426e-abac-27cf6afc7014 tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Terminating instance [ 1629.776258] env[61439]: DEBUG nova.compute.manager [None req-2cb4e165-8200-426e-abac-27cf6afc7014 tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1629.776450] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-2cb4e165-8200-426e-abac-27cf6afc7014 tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1629.776913] env[61439]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-dce38e18-d100-4dd0-b176-2f197f30473b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1629.786192] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2c8abaf-c404-4a3b-b323-c2d72e0fca4b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1629.809543] env[61439]: WARNING nova.virt.vmwareapi.vmops [None req-2cb4e165-8200-426e-abac-27cf6afc7014 tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4abe5722-e83b-4c40-9b82-ca84545496c8 could not be found. 
[ 1629.809775] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-2cb4e165-8200-426e-abac-27cf6afc7014 tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1629.809957] env[61439]: INFO nova.compute.manager [None req-2cb4e165-8200-426e-abac-27cf6afc7014 tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Took 0.03 seconds to destroy the instance on the hypervisor. [ 1629.810227] env[61439]: DEBUG oslo.service.loopingcall [None req-2cb4e165-8200-426e-abac-27cf6afc7014 tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1629.810439] env[61439]: DEBUG nova.compute.manager [-] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1629.810532] env[61439]: DEBUG nova.network.neutron [-] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1629.832805] env[61439]: DEBUG nova.network.neutron [-] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1629.840371] env[61439]: INFO nova.compute.manager [-] [instance: 4abe5722-e83b-4c40-9b82-ca84545496c8] Took 0.03 seconds to deallocate network for instance. 
[ 1629.941567] env[61439]: DEBUG oslo_concurrency.lockutils [None req-2cb4e165-8200-426e-abac-27cf6afc7014 tempest-ServerRescueNegativeTestJSON-1975896800 tempest-ServerRescueNegativeTestJSON-1975896800-project-member] Lock "4abe5722-e83b-4c40-9b82-ca84545496c8" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.169s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1678.900280] env[61439]: WARNING oslo_vmware.rw_handles [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1678.900280] env[61439]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1678.900280] env[61439]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1678.900280] env[61439]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1678.900280] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1678.900280] env[61439]: ERROR oslo_vmware.rw_handles response.begin() [ 1678.900280] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1678.900280] env[61439]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1678.900280] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1678.900280] env[61439]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1678.900280] env[61439]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1678.900280] env[61439]: ERROR oslo_vmware.rw_handles [ 1678.900991] 
env[61439]: DEBUG nova.virt.vmwareapi.images [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Downloaded image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to vmware_temp/43d871d7-52ee-4f2b-a431-694128a3b815/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1678.902766] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Caching image {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1678.903030] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Copying Virtual Disk [datastore2] vmware_temp/43d871d7-52ee-4f2b-a431-694128a3b815/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk to [datastore2] vmware_temp/43d871d7-52ee-4f2b-a431-694128a3b815/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk {{(pid=61439) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1678.903319] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-764c5db5-a1b2-4516-8125-05b115de35c1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1678.910985] env[61439]: DEBUG oslo_vmware.api [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for the task: (returnval){ [ 1678.910985] env[61439]: value = "task-987774" [ 1678.910985] env[61439]: _type = "Task" [ 1678.910985] 
env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1678.920664] env[61439]: DEBUG oslo_vmware.api [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': task-987774, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1679.421321] env[61439]: DEBUG oslo_vmware.exceptions [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Fault InvalidArgument not matched. {{(pid=61439) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1679.421582] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1679.422131] env[61439]: ERROR nova.compute.manager [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1679.422131] env[61439]: Faults: ['InvalidArgument'] [ 1679.422131] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Traceback (most recent call last): [ 1679.422131] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1679.422131] env[61439]: 
ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] yield resources [ 1679.422131] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1679.422131] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] self.driver.spawn(context, instance, image_meta, [ 1679.422131] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1679.422131] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1679.422131] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1679.422131] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] self._fetch_image_if_missing(context, vi) [ 1679.422131] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1679.422131] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] image_cache(vi, tmp_image_ds_loc) [ 1679.422131] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1679.422131] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] vm_util.copy_virtual_disk( [ 1679.422131] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1679.422131] env[61439]: ERROR nova.compute.manager 
[instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] session._wait_for_task(vmdk_copy_task) [ 1679.422131] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1679.422131] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] return self.wait_for_task(task_ref) [ 1679.422131] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1679.422131] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] return evt.wait() [ 1679.422131] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1679.422131] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] result = hub.switch() [ 1679.422131] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1679.422131] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] return self.greenlet.switch() [ 1679.422131] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1679.422131] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] self.f(*self.args, **self.kw) [ 1679.422131] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1679.422131] env[61439]: ERROR 
nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] raise exceptions.translate_fault(task_info.error) [ 1679.422131] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1679.422131] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Faults: ['InvalidArgument'] [ 1679.422131] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] [ 1679.423179] env[61439]: INFO nova.compute.manager [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Terminating instance [ 1679.425283] env[61439]: DEBUG nova.compute.manager [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1679.425473] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1679.426211] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bd3567d-124e-46e0-9ec8-b7532f06211c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1679.432996] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Unregistering the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1679.433231] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0d8caccd-4c70-4e19-8ac4-6df384505096 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1679.498072] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Unregistered the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1679.498285] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Deleting contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1679.498460] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Deleting the datastore file [datastore2] 7c22930b-a1b7-46b7-82ae-de363f85b393 {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1679.498745] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-0eedd26c-6663-4107-bd45-8f55bb5af232 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1679.505259] env[61439]: DEBUG oslo_vmware.api [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for the task: (returnval){ [ 1679.505259] env[61439]: value = "task-987776" [ 1679.505259] env[61439]: _type = "Task" [ 1679.505259] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1679.512303] env[61439]: DEBUG oslo_vmware.api [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': task-987776, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1680.015126] env[61439]: DEBUG oslo_vmware.api [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': task-987776, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067371} completed successfully. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1680.015544] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Deleted the datastore file {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1680.015612] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Deleted contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1680.015751] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1680.015983] env[61439]: INFO nova.compute.manager [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Took 0.59 seconds to destroy the instance on the hypervisor. 
[ 1680.018141] env[61439]: DEBUG nova.compute.claims [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1680.018315] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1680.018530] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1680.097250] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-339dfd2a-c546-4f1c-80ad-57f8c3fda8ac {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1680.104921] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b0f3239-3079-4143-a3ad-df303682d3d6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1680.134387] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e943d34-05ca-4389-aa96-ab949bb967ec {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1680.141310] env[61439]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8afe331e-2991-4e7d-b75b-d8a647fa11ff {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1680.154186] env[61439]: DEBUG nova.compute.provider_tree [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1680.162576] env[61439]: DEBUG nova.scheduler.client.report [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1680.176320] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.158s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1680.176839] env[61439]: ERROR nova.compute.manager [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] 
Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1680.176839] env[61439]: Faults: ['InvalidArgument'] [ 1680.176839] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Traceback (most recent call last): [ 1680.176839] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1680.176839] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] self.driver.spawn(context, instance, image_meta, [ 1680.176839] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1680.176839] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1680.176839] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1680.176839] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] self._fetch_image_if_missing(context, vi) [ 1680.176839] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1680.176839] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] image_cache(vi, tmp_image_ds_loc) [ 1680.176839] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1680.176839] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] vm_util.copy_virtual_disk( [ 1680.176839] env[61439]: 
ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1680.176839] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] session._wait_for_task(vmdk_copy_task) [ 1680.176839] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1680.176839] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] return self.wait_for_task(task_ref) [ 1680.176839] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1680.176839] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] return evt.wait() [ 1680.176839] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1680.176839] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] result = hub.switch() [ 1680.176839] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1680.176839] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] return self.greenlet.switch() [ 1680.176839] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1680.176839] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] self.f(*self.args, **self.kw) [ 1680.176839] env[61439]: ERROR 
nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1680.176839] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] raise exceptions.translate_fault(task_info.error) [ 1680.176839] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1680.176839] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Faults: ['InvalidArgument'] [ 1680.176839] env[61439]: ERROR nova.compute.manager [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] [ 1680.177870] env[61439]: DEBUG nova.compute.utils [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] VimFaultException {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1680.179814] env[61439]: DEBUG nova.compute.manager [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Build of instance 7c22930b-a1b7-46b7-82ae-de363f85b393 was re-scheduled: A specified parameter was not correct: fileType [ 1680.179814] env[61439]: Faults: ['InvalidArgument'] {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1680.180244] env[61439]: DEBUG nova.compute.manager [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1680.180421] env[61439]: DEBUG nova.compute.manager [None 
req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1680.180596] env[61439]: DEBUG nova.compute.manager [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1680.180762] env[61439]: DEBUG nova.network.neutron [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1680.462452] env[61439]: DEBUG nova.network.neutron [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1680.475760] env[61439]: INFO nova.compute.manager [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 7c22930b-a1b7-46b7-82ae-de363f85b393] Took 0.29 seconds to deallocate network for instance. 
[ 1680.569532] env[61439]: INFO nova.scheduler.client.report [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Deleted allocations for instance 7c22930b-a1b7-46b7-82ae-de363f85b393 [ 1680.592458] env[61439]: DEBUG oslo_concurrency.lockutils [None req-8393179c-baf6-49ae-bb58-2638f5142f9a tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "7c22930b-a1b7-46b7-82ae-de363f85b393" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 149.950s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1682.252089] env[61439]: DEBUG oslo_concurrency.lockutils [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "77bcc713-fa13-4c98-97b7-7ea1697d0426" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1682.252392] env[61439]: DEBUG oslo_concurrency.lockutils [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "77bcc713-fa13-4c98-97b7-7ea1697d0426" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1682.261732] env[61439]: DEBUG nova.compute.manager [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1682.309342] env[61439]: DEBUG oslo_concurrency.lockutils [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1682.309588] env[61439]: DEBUG oslo_concurrency.lockutils [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1682.311172] env[61439]: INFO nova.compute.claims [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1682.380024] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d46310a4-b50c-450d-9a6b-fc20037c14a1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1682.388294] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea929130-4a30-436e-8043-fd61b74a56e7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1682.418670] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec44a0fc-fa72-4d34-ba64-aaf00273252d {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1682.425830] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8db0b5a1-3fa1-4d6b-a896-07a11145d0ca {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1682.438604] env[61439]: DEBUG nova.compute.provider_tree [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1682.447362] env[61439]: DEBUG nova.scheduler.client.report [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1682.459675] env[61439]: DEBUG oslo_concurrency.lockutils [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.150s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1682.460149] env[61439]: DEBUG nova.compute.manager [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c 
tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1682.491139] env[61439]: DEBUG nova.compute.utils [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1682.492476] env[61439]: DEBUG nova.compute.manager [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1682.492668] env[61439]: DEBUG nova.network.neutron [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1682.501581] env[61439]: DEBUG nova.compute.manager [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Start building block device mappings for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1682.547170] env[61439]: DEBUG nova.policy [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cf4545757716483485ca9b60bd689a1c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e839303682f748a4b5a42c8a9273e388', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 1682.561361] env[61439]: DEBUG nova.compute.manager [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1682.585026] env[61439]: DEBUG nova.virt.hardware [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1682.585026] env[61439]: DEBUG nova.virt.hardware [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1682.585229] env[61439]: DEBUG nova.virt.hardware [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1682.585322] env[61439]: DEBUG nova.virt.hardware [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:388}} [ 1682.585475] env[61439]: DEBUG nova.virt.hardware [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1682.585626] env[61439]: DEBUG nova.virt.hardware [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1682.585832] env[61439]: DEBUG nova.virt.hardware [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1682.586100] env[61439]: DEBUG nova.virt.hardware [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1682.586302] env[61439]: DEBUG nova.virt.hardware [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1682.586485] env[61439]: DEBUG nova.virt.hardware [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:575}} [ 1682.586666] env[61439]: DEBUG nova.virt.hardware [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1682.587522] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-481d92cb-9e0a-4d97-b683-b97112598561 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1682.595451] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efbfd222-71b4-4bc7-80da-c1f5328f279c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1682.841626] env[61439]: DEBUG nova.network.neutron [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Successfully created port: 60dd8270-5b09-4a25-9873-20c868b28318 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1683.201677] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager.update_available_resource {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1683.212744] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1683.212966] env[61439]: DEBUG 
oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1683.213148] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1683.213307] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=61439) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1683.214388] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6428e24b-2f38-41ff-bdc4-3cf786144c55 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1683.222949] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fd72ed0-72a2-4e24-a03c-0f48e2eb9f44 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1683.236350] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01df7e90-49fd-4d55-8376-c001b37c46c7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1683.242383] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce785d46-44e7-48dd-9ecf-c0a2c9183cf6 {{(pid=61439) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1683.271675] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181516MB free_disk=35GB free_vcpus=48 pci_devices=None {{(pid=61439) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1683.271969] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1683.272041] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1683.317600] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 77bcc713-fa13-4c98-97b7-7ea1697d0426 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1683.317824] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1683.317966] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1683.346644] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4a30210-c550-47af-a09e-14ebf8d37a0c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1683.354393] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08f3fef5-16db-47bc-9aab-cd010998ffd5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1683.383934] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c96a330-abe3-4627-a8b2-de9a3cb61068 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1683.391156] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-671407fe-a2ad-4d97-bce8-0f9ccc9e20f8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1683.404445] env[61439]: DEBUG nova.compute.provider_tree [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not 
changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1683.412072] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1683.426054] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=61439) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1683.426286] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.154s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1683.714734] env[61439]: DEBUG nova.compute.manager [req-cc838d67-b1a7-4e23-b83f-e09347fe84e8 req-67bd5816-81de-409f-979e-a358390531c0 service nova] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Received event network-vif-plugged-60dd8270-5b09-4a25-9873-20c868b28318 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1683.714979] env[61439]: DEBUG oslo_concurrency.lockutils [req-cc838d67-b1a7-4e23-b83f-e09347fe84e8 req-67bd5816-81de-409f-979e-a358390531c0 service nova] 
Acquiring lock "77bcc713-fa13-4c98-97b7-7ea1697d0426-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1683.715215] env[61439]: DEBUG oslo_concurrency.lockutils [req-cc838d67-b1a7-4e23-b83f-e09347fe84e8 req-67bd5816-81de-409f-979e-a358390531c0 service nova] Lock "77bcc713-fa13-4c98-97b7-7ea1697d0426-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1683.715383] env[61439]: DEBUG oslo_concurrency.lockutils [req-cc838d67-b1a7-4e23-b83f-e09347fe84e8 req-67bd5816-81de-409f-979e-a358390531c0 service nova] Lock "77bcc713-fa13-4c98-97b7-7ea1697d0426-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1683.715548] env[61439]: DEBUG nova.compute.manager [req-cc838d67-b1a7-4e23-b83f-e09347fe84e8 req-67bd5816-81de-409f-979e-a358390531c0 service nova] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] No waiting events found dispatching network-vif-plugged-60dd8270-5b09-4a25-9873-20c868b28318 {{(pid=61439) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1683.715710] env[61439]: WARNING nova.compute.manager [req-cc838d67-b1a7-4e23-b83f-e09347fe84e8 req-67bd5816-81de-409f-979e-a358390531c0 service nova] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Received unexpected event network-vif-plugged-60dd8270-5b09-4a25-9873-20c868b28318 for instance with vm_state building and task_state spawning. 
[ 1683.729997] env[61439]: DEBUG nova.network.neutron [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Successfully updated port: 60dd8270-5b09-4a25-9873-20c868b28318 {{(pid=61439) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1683.740297] env[61439]: DEBUG oslo_concurrency.lockutils [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "refresh_cache-77bcc713-fa13-4c98-97b7-7ea1697d0426" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1683.740444] env[61439]: DEBUG oslo_concurrency.lockutils [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquired lock "refresh_cache-77bcc713-fa13-4c98-97b7-7ea1697d0426" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1683.740615] env[61439]: DEBUG nova.network.neutron [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1683.780049] env[61439]: DEBUG nova.network.neutron [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1683.935497] env[61439]: DEBUG nova.network.neutron [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Updating instance_info_cache with network_info: [{"id": "60dd8270-5b09-4a25-9873-20c868b28318", "address": "fa:16:3e:44:36:fd", "network": {"id": "ae0ba33d-286b-46d2-b7e5-caea99f81aea", "bridge": "br-int", "label": "tempest-ServersTestJSON-1142892958-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e839303682f748a4b5a42c8a9273e388", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "779b8e65-8b9e-427e-af08-910febd65bfa", "external-id": "nsx-vlan-transportzone-906", "segmentation_id": 906, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap60dd8270-5b", "ovs_interfaceid": "60dd8270-5b09-4a25-9873-20c868b28318", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1683.948047] env[61439]: DEBUG oslo_concurrency.lockutils [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Releasing lock "refresh_cache-77bcc713-fa13-4c98-97b7-7ea1697d0426" {{(pid=61439) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1683.948322] env[61439]: DEBUG nova.compute.manager [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Instance network_info: |[{"id": "60dd8270-5b09-4a25-9873-20c868b28318", "address": "fa:16:3e:44:36:fd", "network": {"id": "ae0ba33d-286b-46d2-b7e5-caea99f81aea", "bridge": "br-int", "label": "tempest-ServersTestJSON-1142892958-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e839303682f748a4b5a42c8a9273e388", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "779b8e65-8b9e-427e-af08-910febd65bfa", "external-id": "nsx-vlan-transportzone-906", "segmentation_id": 906, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap60dd8270-5b", "ovs_interfaceid": "60dd8270-5b09-4a25-9873-20c868b28318", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1683.948709] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:44:36:fd', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 
'779b8e65-8b9e-427e-af08-910febd65bfa', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '60dd8270-5b09-4a25-9873-20c868b28318', 'vif_model': 'vmxnet3'}] {{(pid=61439) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1683.956145] env[61439]: DEBUG oslo.service.loopingcall [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1683.956931] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Creating VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1683.957172] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8aad1423-2e69-4e7a-9b68-7f5d3e4a313a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1683.977271] env[61439]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1683.977271] env[61439]: value = "task-987777" [ 1683.977271] env[61439]: _type = "Task" [ 1683.977271] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1683.984604] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987777, 'name': CreateVM_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1684.426195] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1684.486800] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987777, 'name': CreateVM_Task, 'duration_secs': 0.300678} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1684.486940] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Created VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1684.487571] env[61439]: DEBUG oslo_concurrency.lockutils [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1684.487738] env[61439]: DEBUG oslo_concurrency.lockutils [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1684.488067] env[61439]: DEBUG oslo_concurrency.lockutils [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1684.488319] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b112bf34-73eb-474d-adab-7b4b1d42b7ad {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1684.492749] env[61439]: DEBUG oslo_vmware.api [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for the task: (returnval){ [ 1684.492749] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]5247e7a1-3054-d076-ecc7-abe22e20f273" [ 1684.492749] env[61439]: _type = "Task" [ 1684.492749] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1684.500312] env[61439]: DEBUG oslo_vmware.api [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]5247e7a1-3054-d076-ecc7-abe22e20f273, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1685.003402] env[61439]: DEBUG oslo_concurrency.lockutils [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1685.003668] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Processing image a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1685.003933] env[61439]: DEBUG oslo_concurrency.lockutils [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1685.004101] env[61439]: DEBUG oslo_concurrency.lockutils [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1685.004289] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Creating directory with path [datastore2] devstack-image-cache_base 
{{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1685.004529] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-13778259-77a2-461e-895d-69d717428c57 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1685.021257] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1685.021435] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=61439) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1685.022147] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cf60e24f-a9bb-4a6b-bf74-196ee76234bb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1685.027183] env[61439]: DEBUG oslo_vmware.api [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for the task: (returnval){ [ 1685.027183] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]522945d9-4d86-d386-d886-696d4102aa0a" [ 1685.027183] env[61439]: _type = "Task" [ 1685.027183] env[61439]: } to complete. 
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1685.034609] env[61439]: DEBUG oslo_vmware.api [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]522945d9-4d86-d386-d886-696d4102aa0a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1685.201535] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1685.201713] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=61439) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1685.537716] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Preparing fetch location {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1685.537997] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Creating directory with path [datastore2] vmware_temp/c1408d20-162d-4781-a29c-c962d6e2ded7/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1685.538215] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5e3dc077-f593-4eb7-b8f1-868f7e7fdd37 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1685.569310] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Created directory with path [datastore2] vmware_temp/c1408d20-162d-4781-a29c-c962d6e2ded7/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1685.569501] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Fetch image to [datastore2] vmware_temp/c1408d20-162d-4781-a29c-c962d6e2ded7/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1685.569673] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to [datastore2] vmware_temp/c1408d20-162d-4781-a29c-c962d6e2ded7/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1685.570517] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c7ec108-dec9-4be0-8546-4c54ca23e69b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1685.577119] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ddd653b-16a9-4156-941b-26cdce26576c {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1685.585724] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05ed3f8a-e96f-4eae-9365-e7c8b15a7b75 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1685.616486] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92dfa615-ec59-44a4-812c-3c47be3dcedc {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1685.622112] env[61439]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-58ec9b9f-f6fe-4c07-8040-d81e1c1ba55f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1685.646083] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1685.709058] env[61439]: DEBUG oslo_vmware.rw_handles [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c1408d20-162d-4781-a29c-c962d6e2ded7/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=61439) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1685.765887] env[61439]: DEBUG nova.compute.manager [req-23179833-a4e8-4407-9bc2-bd7d5692ba83 req-9706358b-d533-47ca-ad66-cbe524fc197f service nova] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Received event network-changed-60dd8270-5b09-4a25-9873-20c868b28318 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1685.766095] env[61439]: DEBUG nova.compute.manager [req-23179833-a4e8-4407-9bc2-bd7d5692ba83 req-9706358b-d533-47ca-ad66-cbe524fc197f service nova] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Refreshing instance network info cache due to event network-changed-60dd8270-5b09-4a25-9873-20c868b28318. {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1685.766328] env[61439]: DEBUG oslo_concurrency.lockutils [req-23179833-a4e8-4407-9bc2-bd7d5692ba83 req-9706358b-d533-47ca-ad66-cbe524fc197f service nova] Acquiring lock "refresh_cache-77bcc713-fa13-4c98-97b7-7ea1697d0426" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1685.766567] env[61439]: DEBUG oslo_concurrency.lockutils [req-23179833-a4e8-4407-9bc2-bd7d5692ba83 req-9706358b-d533-47ca-ad66-cbe524fc197f service nova] Acquired lock "refresh_cache-77bcc713-fa13-4c98-97b7-7ea1697d0426" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1685.766635] env[61439]: DEBUG nova.network.neutron [req-23179833-a4e8-4407-9bc2-bd7d5692ba83 req-9706358b-d533-47ca-ad66-cbe524fc197f service nova] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Refreshing network info cache for port 60dd8270-5b09-4a25-9873-20c868b28318 {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1685.769921] env[61439]: DEBUG oslo_vmware.rw_handles [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c 
tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Completed reading data from the image iterator. {{(pid=61439) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1685.770107] env[61439]: DEBUG oslo_vmware.rw_handles [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c1408d20-162d-4781-a29c-c962d6e2ded7/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1685.998183] env[61439]: DEBUG nova.network.neutron [req-23179833-a4e8-4407-9bc2-bd7d5692ba83 req-9706358b-d533-47ca-ad66-cbe524fc197f service nova] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Updated VIF entry in instance network info cache for port 60dd8270-5b09-4a25-9873-20c868b28318. 
{{(pid=61439) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1685.998522] env[61439]: DEBUG nova.network.neutron [req-23179833-a4e8-4407-9bc2-bd7d5692ba83 req-9706358b-d533-47ca-ad66-cbe524fc197f service nova] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Updating instance_info_cache with network_info: [{"id": "60dd8270-5b09-4a25-9873-20c868b28318", "address": "fa:16:3e:44:36:fd", "network": {"id": "ae0ba33d-286b-46d2-b7e5-caea99f81aea", "bridge": "br-int", "label": "tempest-ServersTestJSON-1142892958-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e839303682f748a4b5a42c8a9273e388", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "779b8e65-8b9e-427e-af08-910febd65bfa", "external-id": "nsx-vlan-transportzone-906", "segmentation_id": 906, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap60dd8270-5b", "ovs_interfaceid": "60dd8270-5b09-4a25-9873-20c868b28318", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1686.007379] env[61439]: DEBUG oslo_concurrency.lockutils [req-23179833-a4e8-4407-9bc2-bd7d5692ba83 req-9706358b-d533-47ca-ad66-cbe524fc197f service nova] Releasing lock "refresh_cache-77bcc713-fa13-4c98-97b7-7ea1697d0426" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1686.197083] env[61439]: DEBUG oslo_service.periodic_task 
[None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1686.201682] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1686.201872] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1686.202045] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1687.201981] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1687.202321] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Starting heal instance info cache {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1687.202321] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Rebuilding the list of instances to heal {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1687.215102] env[61439]: DEBUG nova.compute.manager [None 
req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1687.215264] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Didn't find any instances for network info cache update. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1688.201649] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1692.198324] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1733.021568] env[61439]: WARNING oslo_vmware.rw_handles [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1733.021568] env[61439]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1733.021568] env[61439]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1733.021568] env[61439]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1733.021568] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1733.021568] env[61439]: ERROR oslo_vmware.rw_handles response.begin() [ 1733.021568] env[61439]: ERROR 
oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1733.021568] env[61439]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1733.021568] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1733.021568] env[61439]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1733.021568] env[61439]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1733.021568] env[61439]: ERROR oslo_vmware.rw_handles [ 1733.022418] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Downloaded image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to vmware_temp/c1408d20-162d-4781-a29c-c962d6e2ded7/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1733.024158] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Caching image {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1733.024416] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Copying Virtual Disk [datastore2] vmware_temp/c1408d20-162d-4781-a29c-c962d6e2ded7/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk to [datastore2] vmware_temp/c1408d20-162d-4781-a29c-c962d6e2ded7/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk {{(pid=61439) copy_virtual_disk 
/opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1733.024697] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-00b22d8b-243b-4f6c-adcf-7af090c0c09e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1733.034044] env[61439]: DEBUG oslo_vmware.api [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for the task: (returnval){ [ 1733.034044] env[61439]: value = "task-987778" [ 1733.034044] env[61439]: _type = "Task" [ 1733.034044] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1733.042048] env[61439]: DEBUG oslo_vmware.api [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': task-987778, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1733.544011] env[61439]: DEBUG oslo_vmware.exceptions [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Fault InvalidArgument not matched. 
{{(pid=61439) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1733.544289] env[61439]: DEBUG oslo_concurrency.lockutils [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1733.544872] env[61439]: ERROR nova.compute.manager [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1733.544872] env[61439]: Faults: ['InvalidArgument'] [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Traceback (most recent call last): [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] yield resources [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] self.driver.spawn(context, instance, image_meta, [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 
77bcc713-fa13-4c98-97b7-7ea1697d0426] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] self._fetch_image_if_missing(context, vi) [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] image_cache(vi, tmp_image_ds_loc) [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] vm_util.copy_virtual_disk( [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] session._wait_for_task(vmdk_copy_task) [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] return self.wait_for_task(task_ref) [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1733.544872] env[61439]: ERROR nova.compute.manager 
[instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] return evt.wait() [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] result = hub.switch() [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] return self.greenlet.switch() [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] self.f(*self.args, **self.kw) [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] raise exceptions.translate_fault(task_info.error) [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Faults: ['InvalidArgument'] [ 1733.544872] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] [ 1733.546052] env[61439]: INFO nova.compute.manager [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 
tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Terminating instance [ 1733.548070] env[61439]: DEBUG nova.compute.manager [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1733.548266] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1733.548988] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e83e8efe-4197-4eec-9606-ffbd45055a1c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1733.556267] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Unregistering the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1733.556484] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-587a4b92-6d8f-4a7e-82c4-efca931ffbfe {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1733.635124] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Unregistered the VM {{(pid=61439) _destroy_instance 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1733.635367] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Deleting contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1733.635565] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Deleting the datastore file [datastore2] 77bcc713-fa13-4c98-97b7-7ea1697d0426 {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1733.635833] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-26b256d0-5e0c-45eb-9e2b-08a3541136a9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1733.642865] env[61439]: DEBUG oslo_vmware.api [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for the task: (returnval){ [ 1733.642865] env[61439]: value = "task-987780" [ 1733.642865] env[61439]: _type = "Task" [ 1733.642865] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1733.650456] env[61439]: DEBUG oslo_vmware.api [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': task-987780, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1734.152171] env[61439]: DEBUG oslo_vmware.api [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': task-987780, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.085866} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1734.152542] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Deleted the datastore file {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1734.152644] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Deleted contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1734.152808] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1734.152990] env[61439]: INFO nova.compute.manager [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1734.155058] env[61439]: DEBUG nova.compute.claims [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1734.155236] env[61439]: DEBUG oslo_concurrency.lockutils [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1734.155452] env[61439]: DEBUG oslo_concurrency.lockutils [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1734.226094] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2756bdc-1ad9-4f42-ace6-d282d9b010c5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1734.233632] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b60f302-fc1d-43db-a324-f72e8565cf14 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1734.262785] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e377efae-dc53-430f-b1ed-2c21fe762dc2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1734.269974] env[61439]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bac8ea94-fb92-4c2c-8514-4bcb6c844c50 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1734.282680] env[61439]: DEBUG nova.compute.provider_tree [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1734.291421] env[61439]: DEBUG nova.scheduler.client.report [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1734.305043] env[61439]: DEBUG oslo_concurrency.lockutils [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.149s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1734.305560] env[61439]: ERROR nova.compute.manager [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] 
Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1734.305560] env[61439]: Faults: ['InvalidArgument'] [ 1734.305560] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Traceback (most recent call last): [ 1734.305560] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1734.305560] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] self.driver.spawn(context, instance, image_meta, [ 1734.305560] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1734.305560] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1734.305560] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1734.305560] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] self._fetch_image_if_missing(context, vi) [ 1734.305560] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1734.305560] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] image_cache(vi, tmp_image_ds_loc) [ 1734.305560] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1734.305560] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] vm_util.copy_virtual_disk( [ 1734.305560] env[61439]: 
ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1734.305560] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] session._wait_for_task(vmdk_copy_task) [ 1734.305560] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1734.305560] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] return self.wait_for_task(task_ref) [ 1734.305560] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1734.305560] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] return evt.wait() [ 1734.305560] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1734.305560] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] result = hub.switch() [ 1734.305560] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1734.305560] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] return self.greenlet.switch() [ 1734.305560] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1734.305560] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] self.f(*self.args, **self.kw) [ 1734.305560] env[61439]: ERROR 
nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1734.305560] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] raise exceptions.translate_fault(task_info.error) [ 1734.305560] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1734.305560] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Faults: ['InvalidArgument'] [ 1734.305560] env[61439]: ERROR nova.compute.manager [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] [ 1734.306776] env[61439]: DEBUG nova.compute.utils [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] VimFaultException {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1734.307642] env[61439]: DEBUG nova.compute.manager [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Build of instance 77bcc713-fa13-4c98-97b7-7ea1697d0426 was re-scheduled: A specified parameter was not correct: fileType [ 1734.307642] env[61439]: Faults: ['InvalidArgument'] {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1734.308008] env[61439]: DEBUG nova.compute.manager [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1734.308194] env[61439]: DEBUG nova.compute.manager [None 
req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1734.308367] env[61439]: DEBUG nova.compute.manager [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1734.308535] env[61439]: DEBUG nova.network.neutron [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1734.578391] env[61439]: DEBUG nova.network.neutron [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1734.594526] env[61439]: INFO nova.compute.manager [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 77bcc713-fa13-4c98-97b7-7ea1697d0426] Took 0.29 seconds to deallocate network for instance. 
[ 1734.682719] env[61439]: INFO nova.scheduler.client.report [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Deleted allocations for instance 77bcc713-fa13-4c98-97b7-7ea1697d0426 [ 1734.703500] env[61439]: DEBUG oslo_concurrency.lockutils [None req-36f76f82-7da8-4380-bbd5-128f92b9eb8c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "77bcc713-fa13-4c98-97b7-7ea1697d0426" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 52.451s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1736.041131] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "53c6f49c-42cc-49c3-a3b4-517a96f0502c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1736.041416] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "53c6f49c-42cc-49c3-a3b4-517a96f0502c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1736.050806] env[61439]: DEBUG nova.compute.manager [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1736.096877] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1736.097136] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1736.098514] env[61439]: INFO nova.compute.claims [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1736.166430] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad46fa0e-2f0e-483a-8eb9-fb5be530bbc6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1736.174303] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22c4b475-5f80-4ae6-9ebe-a8b2ad3c27ae {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1736.205008] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cb88d30-5b84-4c57-8093-14396030dfba {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1736.212071] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3262b90e-904d-4a0b-8466-c92e8298bfc4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1736.224909] env[61439]: DEBUG nova.compute.provider_tree [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1736.233575] env[61439]: DEBUG nova.scheduler.client.report [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1736.245771] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.149s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1736.246216] env[61439]: DEBUG nova.compute.manager [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 1736.276243] env[61439]: DEBUG nova.compute.utils [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1736.277381] env[61439]: DEBUG nova.compute.manager [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 1736.277549] env[61439]: DEBUG nova.network.neutron [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1736.285887] env[61439]: DEBUG nova.compute.manager [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 1736.346982] env[61439]: DEBUG nova.compute.manager [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Start spawning the instance on the hypervisor.
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 1736.351243] env[61439]: DEBUG nova.policy [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cf4545757716483485ca9b60bd689a1c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e839303682f748a4b5a42c8a9273e388', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1736.370483] env[61439]: DEBUG nova.virt.hardware [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1736.370716] env[61439]: DEBUG nova.virt.hardware [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1736.370877] env[61439]: DEBUG nova.virt.hardware [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1736.371068] env[61439]: DEBUG nova.virt.hardware [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1736.371217] env[61439]: DEBUG nova.virt.hardware [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1736.371364] env[61439]: DEBUG nova.virt.hardware [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1736.371564] env[61439]: DEBUG nova.virt.hardware [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1736.371723] env[61439]: DEBUG nova.virt.hardware [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1736.371895] env[61439]: DEBUG nova.virt.hardware [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1736.372068] env[61439]: DEBUG nova.virt.hardware [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1736.372250] env[61439]: DEBUG nova.virt.hardware [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1736.373129] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b93a553-3f39-4a94-8bd0-56f4ec909078 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1736.380756] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f521e9a5-5cad-4e96-b547-09d7af565668 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1736.636925] env[61439]: DEBUG nova.network.neutron [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Successfully created port: 3f3d6c4e-c6c9-4d18-b1a4-57c765fbb3be {{(pid=61439) _create_port_minimal
/opt/stack/nova/nova/network/neutron.py:548}}
[ 1737.176814] env[61439]: DEBUG nova.compute.manager [req-8dc59cce-1d15-4920-8d42-743bac2b2185 req-adbfa2ee-cc3a-4076-8e89-63c880789a72 service nova] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Received event network-vif-plugged-3f3d6c4e-c6c9-4d18-b1a4-57c765fbb3be {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1737.177064] env[61439]: DEBUG oslo_concurrency.lockutils [req-8dc59cce-1d15-4920-8d42-743bac2b2185 req-adbfa2ee-cc3a-4076-8e89-63c880789a72 service nova] Acquiring lock "53c6f49c-42cc-49c3-a3b4-517a96f0502c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1737.177266] env[61439]: DEBUG oslo_concurrency.lockutils [req-8dc59cce-1d15-4920-8d42-743bac2b2185 req-adbfa2ee-cc3a-4076-8e89-63c880789a72 service nova] Lock "53c6f49c-42cc-49c3-a3b4-517a96f0502c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1737.177435] env[61439]: DEBUG oslo_concurrency.lockutils [req-8dc59cce-1d15-4920-8d42-743bac2b2185 req-adbfa2ee-cc3a-4076-8e89-63c880789a72 service nova] Lock "53c6f49c-42cc-49c3-a3b4-517a96f0502c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1737.177603] env[61439]: DEBUG nova.compute.manager [req-8dc59cce-1d15-4920-8d42-743bac2b2185 req-adbfa2ee-cc3a-4076-8e89-63c880789a72 service nova] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] No waiting events found dispatching network-vif-plugged-3f3d6c4e-c6c9-4d18-b1a4-57c765fbb3be {{(pid=61439) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1737.177771] env[61439]: WARNING nova.compute.manager [req-8dc59cce-1d15-4920-8d42-743bac2b2185 req-adbfa2ee-cc3a-4076-8e89-63c880789a72 service nova] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Received unexpected event network-vif-plugged-3f3d6c4e-c6c9-4d18-b1a4-57c765fbb3be for instance with vm_state building and task_state spawning.
[ 1737.256315] env[61439]: DEBUG nova.network.neutron [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Successfully updated port: 3f3d6c4e-c6c9-4d18-b1a4-57c765fbb3be {{(pid=61439) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1737.265996] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "refresh_cache-53c6f49c-42cc-49c3-a3b4-517a96f0502c" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1737.266170] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquired lock "refresh_cache-53c6f49c-42cc-49c3-a3b4-517a96f0502c" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1737.266321] env[61439]: DEBUG nova.network.neutron [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1737.305330] env[61439]: DEBUG nova.network.neutron [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1737.459614] env[61439]: DEBUG nova.network.neutron [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Updating instance_info_cache with network_info: [{"id": "3f3d6c4e-c6c9-4d18-b1a4-57c765fbb3be", "address": "fa:16:3e:71:c2:33", "network": {"id": "ae0ba33d-286b-46d2-b7e5-caea99f81aea", "bridge": "br-int", "label": "tempest-ServersTestJSON-1142892958-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e839303682f748a4b5a42c8a9273e388", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "779b8e65-8b9e-427e-af08-910febd65bfa", "external-id": "nsx-vlan-transportzone-906", "segmentation_id": 906, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3f3d6c4e-c6", "ovs_interfaceid": "3f3d6c4e-c6c9-4d18-b1a4-57c765fbb3be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1737.472496] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Releasing lock "refresh_cache-53c6f49c-42cc-49c3-a3b4-517a96f0502c" {{(pid=61439) lock
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1737.472814] env[61439]: DEBUG nova.compute.manager [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Instance network_info: |[{"id": "3f3d6c4e-c6c9-4d18-b1a4-57c765fbb3be", "address": "fa:16:3e:71:c2:33", "network": {"id": "ae0ba33d-286b-46d2-b7e5-caea99f81aea", "bridge": "br-int", "label": "tempest-ServersTestJSON-1142892958-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e839303682f748a4b5a42c8a9273e388", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "779b8e65-8b9e-427e-af08-910febd65bfa", "external-id": "nsx-vlan-transportzone-906", "segmentation_id": 906, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3f3d6c4e-c6", "ovs_interfaceid": "3f3d6c4e-c6c9-4d18-b1a4-57c765fbb3be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 1737.473218] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:71:c2:33', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '779b8e65-8b9e-427e-af08-910febd65bfa', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3f3d6c4e-c6c9-4d18-b1a4-57c765fbb3be', 'vif_model': 'vmxnet3'}] {{(pid=61439) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1737.480603] env[61439]: DEBUG oslo.service.loopingcall [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1737.481130] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Creating VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1737.481413] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f9004d77-3e98-4c84-a063-e17a70b2d09f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1737.502231] env[61439]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1737.502231] env[61439]: value = "task-987781"
[ 1737.502231] env[61439]: _type = "Task"
[ 1737.502231] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1737.510246] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987781, 'name': CreateVM_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1738.014230] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987781, 'name': CreateVM_Task} progress is 99%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1738.513472] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987781, 'name': CreateVM_Task} progress is 99%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1739.014471] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987781, 'name': CreateVM_Task, 'duration_secs': 1.321411} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1739.014896] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Created VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1739.015682] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1739.016034] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1739.016468] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1739.016824] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5505691f-1a48-4e15-b17b-5b5142f70873 {{(pid=61439) request_handler
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1739.021150] env[61439]: DEBUG oslo_vmware.api [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for the task: (returnval){
[ 1739.021150] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]5238657c-72ab-0207-8e00-8558fac54c83"
[ 1739.021150] env[61439]: _type = "Task"
[ 1739.021150] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1739.032103] env[61439]: DEBUG oslo_vmware.api [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]5238657c-72ab-0207-8e00-8558fac54c83, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1739.210920] env[61439]: DEBUG nova.compute.manager [req-a665583d-d866-4569-919b-bf4693735d61 req-e4c3083b-f241-4927-ba40-aeda18f25b6b service nova] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Received event network-changed-3f3d6c4e-c6c9-4d18-b1a4-57c765fbb3be {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1739.211141] env[61439]: DEBUG nova.compute.manager [req-a665583d-d866-4569-919b-bf4693735d61 req-e4c3083b-f241-4927-ba40-aeda18f25b6b service nova] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Refreshing instance network info cache due to event network-changed-3f3d6c4e-c6c9-4d18-b1a4-57c765fbb3be. {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 1739.211357] env[61439]: DEBUG oslo_concurrency.lockutils [req-a665583d-d866-4569-919b-bf4693735d61 req-e4c3083b-f241-4927-ba40-aeda18f25b6b service nova] Acquiring lock "refresh_cache-53c6f49c-42cc-49c3-a3b4-517a96f0502c" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1739.211502] env[61439]: DEBUG oslo_concurrency.lockutils [req-a665583d-d866-4569-919b-bf4693735d61 req-e4c3083b-f241-4927-ba40-aeda18f25b6b service nova] Acquired lock "refresh_cache-53c6f49c-42cc-49c3-a3b4-517a96f0502c" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1739.211662] env[61439]: DEBUG nova.network.neutron [req-a665583d-d866-4569-919b-bf4693735d61 req-e4c3083b-f241-4927-ba40-aeda18f25b6b service nova] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Refreshing network info cache for port 3f3d6c4e-c6c9-4d18-b1a4-57c765fbb3be {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1739.461715] env[61439]: DEBUG nova.network.neutron [req-a665583d-d866-4569-919b-bf4693735d61 req-e4c3083b-f241-4927-ba40-aeda18f25b6b service nova] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Updated VIF entry in instance network info cache for port 3f3d6c4e-c6c9-4d18-b1a4-57c765fbb3be. {{(pid=61439) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1739.462088] env[61439]: DEBUG nova.network.neutron [req-a665583d-d866-4569-919b-bf4693735d61 req-e4c3083b-f241-4927-ba40-aeda18f25b6b service nova] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Updating instance_info_cache with network_info: [{"id": "3f3d6c4e-c6c9-4d18-b1a4-57c765fbb3be", "address": "fa:16:3e:71:c2:33", "network": {"id": "ae0ba33d-286b-46d2-b7e5-caea99f81aea", "bridge": "br-int", "label": "tempest-ServersTestJSON-1142892958-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e839303682f748a4b5a42c8a9273e388", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "779b8e65-8b9e-427e-af08-910febd65bfa", "external-id": "nsx-vlan-transportzone-906", "segmentation_id": 906, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3f3d6c4e-c6", "ovs_interfaceid": "3f3d6c4e-c6c9-4d18-b1a4-57c765fbb3be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1739.472705] env[61439]: DEBUG oslo_concurrency.lockutils [req-a665583d-d866-4569-919b-bf4693735d61 req-e4c3083b-f241-4927-ba40-aeda18f25b6b service nova] Releasing lock "refresh_cache-53c6f49c-42cc-49c3-a3b4-517a96f0502c" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1739.532017] env[61439]: DEBUG oslo_concurrency.lockutils
[None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1739.532351] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Processing image a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1739.532504] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1739.532653] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1739.532866] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1739.533129] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3fb651db-4b1b-4859-a374-7817f8d60fe4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1739.540688] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1739.540876] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=61439) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1739.541541] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c27e4c77-7a2b-4001-ae0d-ef0e86423954 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1739.546259] env[61439]: DEBUG oslo_vmware.api [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for the task: (returnval){
[ 1739.546259] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52a29a5c-e253-b34f-1098-74fa07cfb93d"
[ 1739.546259] env[61439]: _type = "Task"
[ 1739.546259] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1739.553496] env[61439]: DEBUG oslo_vmware.api [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52a29a5c-e253-b34f-1098-74fa07cfb93d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1740.056182] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Preparing fetch location {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1740.056444] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Creating directory with path [datastore2] vmware_temp/b6057a4f-3a28-4c9e-aea1-67f9b5e550bd/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1740.056669] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cd431ac3-863f-4007-b6d6-2742f6cbf346 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1740.075045] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Created directory with path [datastore2] vmware_temp/b6057a4f-3a28-4c9e-aea1-67f9b5e550bd/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1740.075243] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Fetch image to [datastore2] vmware_temp/b6057a4f-3a28-4c9e-aea1-67f9b5e550bd/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1740.075416] env[61439]: DEBUG
nova.virt.vmwareapi.vmops [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to [datastore2] vmware_temp/b6057a4f-3a28-4c9e-aea1-67f9b5e550bd/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1740.076152] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bdaafc3-47ce-45f7-af1d-8097b75635b4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1740.082636] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33399449-8d79-4988-87b2-d857eadd358e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1740.091462] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77bb9c2f-1148-4ea8-bc36-2e27865f3834 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1740.122206] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92accf5b-465b-417a-83b5-53dd315de35b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1740.127439] env[61439]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7bfe6029-2046-4a20-a624-a528d92e239f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1740.148913] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1740.194282] env[61439]: DEBUG oslo_vmware.rw_handles [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b6057a4f-3a28-4c9e-aea1-67f9b5e550bd/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1740.254158] env[61439]: DEBUG oslo_vmware.rw_handles [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Completed reading data from the image iterator. {{(pid=61439) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1740.254350] env[61439]: DEBUG oslo_vmware.rw_handles [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b6057a4f-3a28-4c9e-aea1-67f9b5e550bd/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1744.202070] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager.update_available_resource {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1744.213087] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1744.213310] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1744.213479] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1744.213639] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=61439) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 1744.214769] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4735a05-4f83-4f97-b0df-4db50592b87d {{(pid=61439) request_handler
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.223341] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3dc50792-91cb-42ca-ac11-c9d6e5d38c26 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.236672] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e588f69-4b45-41ce-995f-61d0b4f1ee27 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.242610] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c37e3b6-db73-43cd-b531-00d74ba2eae3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.270291] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181567MB free_disk=35GB free_vcpus=48 pci_devices=None {{(pid=61439) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1744.270428] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1744.270625] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1744.310837] env[61439]: DEBUG 
nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 53c6f49c-42cc-49c3-a3b4-517a96f0502c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1744.311057] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1744.311210] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1744.338566] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2682adaf-d701-4efc-9586-49e2f227042c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.345831] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec957741-bb42-4c24-ae59-89b263effc25 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.375325] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7502530-e78e-45bf-8ec3-5b7c55c04e8d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.382144] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-6d8cb8a9-3c73-4914-8cc6-07409b7d4e56 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1744.395018] env[61439]: DEBUG nova.compute.provider_tree [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1744.403207] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1744.415532] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=61439) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1744.415714] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.145s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1745.415626] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=61439) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1746.201850] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1746.202163] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1746.202326] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1746.202479] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=61439) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1747.197657] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1747.201285] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1747.201448] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Starting heal instance info cache {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1747.201572] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Rebuilding the list of instances to heal {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1747.212296] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1747.212458] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Didn't find any instances for network info cache update. 
{{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1748.201967] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1748.202363] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1788.039741] env[61439]: WARNING oslo_vmware.rw_handles [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1788.039741] env[61439]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1788.039741] env[61439]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1788.039741] env[61439]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1788.039741] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1788.039741] env[61439]: ERROR oslo_vmware.rw_handles response.begin() [ 1788.039741] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1788.039741] env[61439]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1788.039741] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1788.039741] env[61439]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 
1788.039741] env[61439]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1788.039741] env[61439]: ERROR oslo_vmware.rw_handles [ 1788.040480] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Downloaded image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to vmware_temp/b6057a4f-3a28-4c9e-aea1-67f9b5e550bd/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1788.042199] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Caching image {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1788.042459] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Copying Virtual Disk [datastore2] vmware_temp/b6057a4f-3a28-4c9e-aea1-67f9b5e550bd/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk to [datastore2] vmware_temp/b6057a4f-3a28-4c9e-aea1-67f9b5e550bd/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk {{(pid=61439) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1788.042823] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6d3bd18d-543d-4cbe-8be9-ea778a9dd6e8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1788.052901] env[61439]: DEBUG oslo_vmware.api [None req-82e51024-9f2d-4582-a8a9-9b00c143581c 
tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for the task: (returnval){ [ 1788.052901] env[61439]: value = "task-987782" [ 1788.052901] env[61439]: _type = "Task" [ 1788.052901] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1788.060499] env[61439]: DEBUG oslo_vmware.api [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': task-987782, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1788.563453] env[61439]: DEBUG oslo_vmware.exceptions [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Fault InvalidArgument not matched. {{(pid=61439) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1788.563749] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1788.564327] env[61439]: ERROR nova.compute.manager [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1788.564327] env[61439]: Faults: ['InvalidArgument'] [ 1788.564327] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Traceback (most 
recent call last): [ 1788.564327] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1788.564327] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] yield resources [ 1788.564327] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1788.564327] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] self.driver.spawn(context, instance, image_meta, [ 1788.564327] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1788.564327] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1788.564327] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1788.564327] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] self._fetch_image_if_missing(context, vi) [ 1788.564327] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1788.564327] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] image_cache(vi, tmp_image_ds_loc) [ 1788.564327] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1788.564327] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] vm_util.copy_virtual_disk( [ 1788.564327] 
env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1788.564327] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] session._wait_for_task(vmdk_copy_task) [ 1788.564327] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1788.564327] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] return self.wait_for_task(task_ref) [ 1788.564327] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1788.564327] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] return evt.wait() [ 1788.564327] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1788.564327] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] result = hub.switch() [ 1788.564327] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1788.564327] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] return self.greenlet.switch() [ 1788.564327] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1788.564327] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] self.f(*self.args, **self.kw) [ 1788.564327] 
env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1788.564327] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] raise exceptions.translate_fault(task_info.error) [ 1788.564327] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1788.564327] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Faults: ['InvalidArgument'] [ 1788.564327] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] [ 1788.565284] env[61439]: INFO nova.compute.manager [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Terminating instance [ 1788.567599] env[61439]: DEBUG nova.compute.manager [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1788.567790] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1788.568535] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab3b602d-4b87-4ce9-afac-5350be38f8dd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1788.574930] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Unregistering the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1788.575173] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-cd5c5248-4255-49d1-8da2-531663957add {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1788.638935] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Unregistered the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1788.639179] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Deleting contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1788.639347] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Deleting the datastore file [datastore2] 53c6f49c-42cc-49c3-a3b4-517a96f0502c {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1788.639620] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-05e00ac3-864d-4f16-a8d1-e996dfea90b6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1788.645306] env[61439]: DEBUG oslo_vmware.api [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for the task: (returnval){ [ 1788.645306] env[61439]: value = "task-987784" [ 1788.645306] env[61439]: _type = "Task" [ 1788.645306] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1788.652176] env[61439]: DEBUG oslo_vmware.api [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': task-987784, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1789.156124] env[61439]: DEBUG oslo_vmware.api [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': task-987784, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.062807} completed successfully. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1789.156552] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Deleted the datastore file {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1789.156552] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Deleted contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1789.156668] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1789.156842] env[61439]: INFO nova.compute.manager [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Took 0.59 seconds to destroy the instance on the hypervisor. 
[ 1789.159111] env[61439]: DEBUG nova.compute.claims [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1789.159298] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1789.159504] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1789.231292] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79a6a952-89d4-4698-8b1b-c921a84b5253 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1789.239743] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cb75b81-3a88-4059-afb5-e25b827c278c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1789.270030] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea593fd3-a3f9-4a8a-99ad-15b8e03361e8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1789.278023] env[61439]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5bafec8d-fd40-49a0-990c-d464515cf975 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1789.291328] env[61439]: DEBUG nova.compute.provider_tree [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1789.300419] env[61439]: DEBUG nova.scheduler.client.report [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1789.315316] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.156s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1789.315865] env[61439]: ERROR nova.compute.manager [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1789.315865] env[61439]: Faults: ['InvalidArgument']
[ 1789.315865] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Traceback (most recent call last):
[ 1789.315865] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1789.315865] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c]     self.driver.spawn(context, instance, image_meta,
[ 1789.315865] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1789.315865] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1789.315865] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1789.315865] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c]     self._fetch_image_if_missing(context, vi)
[ 1789.315865] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1789.315865] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c]     image_cache(vi, tmp_image_ds_loc)
[ 1789.315865] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1789.315865] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c]     vm_util.copy_virtual_disk(
[ 1789.315865] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1789.315865] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c]     session._wait_for_task(vmdk_copy_task)
[ 1789.315865] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1789.315865] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c]     return self.wait_for_task(task_ref)
[ 1789.315865] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1789.315865] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c]     return evt.wait()
[ 1789.315865] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1789.315865] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c]     result = hub.switch()
[ 1789.315865] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1789.315865] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c]     return self.greenlet.switch()
[ 1789.315865] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1789.315865] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c]     self.f(*self.args, **self.kw)
[ 1789.315865] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1789.315865] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c]     raise exceptions.translate_fault(task_info.error)
[ 1789.315865] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1789.315865] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Faults: ['InvalidArgument']
[ 1789.315865] env[61439]: ERROR nova.compute.manager [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c]
[ 1789.316899] env[61439]: DEBUG nova.compute.utils [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] VimFaultException {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1789.318237] env[61439]: DEBUG nova.compute.manager [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Build of instance 53c6f49c-42cc-49c3-a3b4-517a96f0502c was re-scheduled: A specified parameter was not correct: fileType
[ 1789.318237] env[61439]: Faults: ['InvalidArgument'] {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1789.318620] env[61439]: DEBUG nova.compute.manager [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1789.318794] env[61439]: DEBUG nova.compute.manager [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 1789.318971] env[61439]: DEBUG nova.compute.manager [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1789.319153] env[61439]: DEBUG nova.network.neutron [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1789.579008] env[61439]: DEBUG nova.network.neutron [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1789.590021] env[61439]: INFO nova.compute.manager [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 53c6f49c-42cc-49c3-a3b4-517a96f0502c] Took 0.27 seconds to deallocate network for instance.
[ 1789.681513] env[61439]: INFO nova.scheduler.client.report [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Deleted allocations for instance 53c6f49c-42cc-49c3-a3b4-517a96f0502c
[ 1789.699344] env[61439]: DEBUG oslo_concurrency.lockutils [None req-82e51024-9f2d-4582-a8a9-9b00c143581c tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "53c6f49c-42cc-49c3-a3b4-517a96f0502c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 53.658s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1790.787444] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "d25ad8a1-1f52-4e6b-93a5-629420a06e66" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1790.787765] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "d25ad8a1-1f52-4e6b-93a5-629420a06e66" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1790.797123] env[61439]: DEBUG nova.compute.manager [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Starting instance... {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1790.842470] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1790.842735] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1790.844179] env[61439]: INFO nova.compute.claims [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1790.909312] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6c7a81b-5690-4326-a32a-b88ec0350ae5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1790.916651] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a75d65a-1c0b-46b3-ad97-c034094b9c8b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1790.947168] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eea199db-4a91-4744-a7cb-df1318abaa36 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1790.953936] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56cf301e-3d89-4ec1-88cf-78d50e0967d1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1790.967145] env[61439]: DEBUG nova.compute.provider_tree [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1790.975982] env[61439]: DEBUG nova.scheduler.client.report [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1790.987622] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.145s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1790.988070] env[61439]: DEBUG nova.compute.manager [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 1791.016970] env[61439]: DEBUG nova.compute.utils [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1791.018074] env[61439]: DEBUG nova.compute.manager [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 1791.018244] env[61439]: DEBUG nova.network.neutron [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1791.027978] env[61439]: DEBUG nova.compute.manager [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Start building block device mappings for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 1791.084370] env[61439]: DEBUG nova.policy [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cf4545757716483485ca9b60bd689a1c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e839303682f748a4b5a42c8a9273e388', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1791.089779] env[61439]: DEBUG nova.compute.manager [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Start spawning the instance on the hypervisor. {{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 1791.114490] env[61439]: DEBUG nova.virt.hardware [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1791.114770] env[61439]: DEBUG nova.virt.hardware [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1791.114964] env[61439]: DEBUG nova.virt.hardware [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1791.115238] env[61439]: DEBUG nova.virt.hardware [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1791.115462] env[61439]: DEBUG nova.virt.hardware [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1791.115744] env[61439]: DEBUG nova.virt.hardware [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1791.115921] env[61439]: DEBUG nova.virt.hardware [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1791.116155] env[61439]: DEBUG nova.virt.hardware [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1791.116377] env[61439]: DEBUG nova.virt.hardware [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1791.116598] env[61439]: DEBUG nova.virt.hardware [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1791.116873] env[61439]: DEBUG nova.virt.hardware [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1791.117933] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe484c8b-9836-4281-be1a-08646bc18224 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1791.127870] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1c2a1f1-60e5-413c-8df8-95f779f6511b {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1791.416010] env[61439]: DEBUG nova.network.neutron [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Successfully created port: b0101f31-74be-43ab-9744-a2abb049b4c4 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1791.901148] env[61439]: DEBUG nova.compute.manager [req-9d8493e9-dae4-4bfb-82a4-0ac50aed4712 req-f0e678fb-5fb6-4f91-9b0f-7c97a915ec45 service nova] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Received event network-vif-plugged-b0101f31-74be-43ab-9744-a2abb049b4c4 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1791.901422] env[61439]: DEBUG oslo_concurrency.lockutils [req-9d8493e9-dae4-4bfb-82a4-0ac50aed4712 req-f0e678fb-5fb6-4f91-9b0f-7c97a915ec45 service nova] Acquiring lock "d25ad8a1-1f52-4e6b-93a5-629420a06e66-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1791.901580] env[61439]: DEBUG oslo_concurrency.lockutils [req-9d8493e9-dae4-4bfb-82a4-0ac50aed4712 req-f0e678fb-5fb6-4f91-9b0f-7c97a915ec45 service nova] Lock "d25ad8a1-1f52-4e6b-93a5-629420a06e66-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1791.901748] env[61439]: DEBUG oslo_concurrency.lockutils [req-9d8493e9-dae4-4bfb-82a4-0ac50aed4712 req-f0e678fb-5fb6-4f91-9b0f-7c97a915ec45 service nova] Lock "d25ad8a1-1f52-4e6b-93a5-629420a06e66-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1791.901924] env[61439]: DEBUG nova.compute.manager [req-9d8493e9-dae4-4bfb-82a4-0ac50aed4712 req-f0e678fb-5fb6-4f91-9b0f-7c97a915ec45 service nova] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] No waiting events found dispatching network-vif-plugged-b0101f31-74be-43ab-9744-a2abb049b4c4 {{(pid=61439) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1791.902143] env[61439]: WARNING nova.compute.manager [req-9d8493e9-dae4-4bfb-82a4-0ac50aed4712 req-f0e678fb-5fb6-4f91-9b0f-7c97a915ec45 service nova] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Received unexpected event network-vif-plugged-b0101f31-74be-43ab-9744-a2abb049b4c4 for instance with vm_state building and task_state spawning.
[ 1791.982228] env[61439]: DEBUG nova.network.neutron [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Successfully updated port: b0101f31-74be-43ab-9744-a2abb049b4c4 {{(pid=61439) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1791.993962] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "refresh_cache-d25ad8a1-1f52-4e6b-93a5-629420a06e66" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1791.994178] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquired lock "refresh_cache-d25ad8a1-1f52-4e6b-93a5-629420a06e66" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1791.994382] env[61439]: DEBUG nova.network.neutron [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1792.032212] env[61439]: DEBUG nova.network.neutron [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Instance cache missing network info. {{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1792.192645] env[61439]: DEBUG nova.network.neutron [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Updating instance_info_cache with network_info: [{"id": "b0101f31-74be-43ab-9744-a2abb049b4c4", "address": "fa:16:3e:d6:ca:e1", "network": {"id": "ae0ba33d-286b-46d2-b7e5-caea99f81aea", "bridge": "br-int", "label": "tempest-ServersTestJSON-1142892958-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e839303682f748a4b5a42c8a9273e388", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "779b8e65-8b9e-427e-af08-910febd65bfa", "external-id": "nsx-vlan-transportzone-906", "segmentation_id": 906, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb0101f31-74", "ovs_interfaceid": "b0101f31-74be-43ab-9744-a2abb049b4c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1792.203644] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Releasing lock "refresh_cache-d25ad8a1-1f52-4e6b-93a5-629420a06e66" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1792.203974] env[61439]: DEBUG nova.compute.manager [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Instance network_info: |[{"id": "b0101f31-74be-43ab-9744-a2abb049b4c4", "address": "fa:16:3e:d6:ca:e1", "network": {"id": "ae0ba33d-286b-46d2-b7e5-caea99f81aea", "bridge": "br-int", "label": "tempest-ServersTestJSON-1142892958-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e839303682f748a4b5a42c8a9273e388", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "779b8e65-8b9e-427e-af08-910febd65bfa", "external-id": "nsx-vlan-transportzone-906", "segmentation_id": 906, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb0101f31-74", "ovs_interfaceid": "b0101f31-74be-43ab-9744-a2abb049b4c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 1792.204374] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d6:ca:e1', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '779b8e65-8b9e-427e-af08-910febd65bfa', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b0101f31-74be-43ab-9744-a2abb049b4c4', 'vif_model': 'vmxnet3'}] {{(pid=61439) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1792.211900] env[61439]: DEBUG oslo.service.loopingcall [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1792.212370] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Creating VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1792.212591] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0ffede2e-a4ce-4f03-8984-4474365988c8 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1792.232634] env[61439]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1792.232634] env[61439]: value = "task-987785"
[ 1792.232634] env[61439]: _type = "Task"
[ 1792.232634] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1792.240334] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987785, 'name': CreateVM_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1792.744722] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987785, 'name': CreateVM_Task, 'duration_secs': 0.291566} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1792.744990] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Created VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1792.745654] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1792.745818] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1792.746152] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1792.746398] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d9a51aa4-950d-4ee6-a417-3fd182c97f6d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1792.750835] env[61439]: DEBUG oslo_vmware.api [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for the task: (returnval){
[ 1792.750835] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52ec3064-5253-30aa-f396-4b22bad8e47d"
[ 1792.750835] env[61439]: _type = "Task"
[ 1792.750835] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1792.758236] env[61439]: DEBUG oslo_vmware.api [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52ec3064-5253-30aa-f396-4b22bad8e47d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1793.261421] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1793.261810] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Processing image a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1793.262157] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1793.262402] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1793.262534] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1793.262801] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-08c57c72-60bf-49cb-9e22-30957af0d588 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1793.279033] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1793.279223] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Folder [datastore2] devstack-image-cache_base created.
{{(pid=61439) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1793.279932] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cefa7b5c-1bb9-4be7-ad58-107f74d6c5a0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1793.284885] env[61439]: DEBUG oslo_vmware.api [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for the task: (returnval){ [ 1793.284885] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52d61021-1a73-db37-5656-a27dbaf3f774" [ 1793.284885] env[61439]: _type = "Task" [ 1793.284885] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1793.292081] env[61439]: DEBUG oslo_vmware.api [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52d61021-1a73-db37-5656-a27dbaf3f774, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1793.795534] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Preparing fetch location {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1793.795534] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Creating directory with path [datastore2] vmware_temp/924d4a51-1787-4267-950f-2a43edfe766b/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1793.795748] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ab3d5102-59c4-4538-a8e6-b72ca20a29ad {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1793.815937] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Created directory with path [datastore2] vmware_temp/924d4a51-1787-4267-950f-2a43edfe766b/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1793.816141] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Fetch image to [datastore2] vmware_temp/924d4a51-1787-4267-950f-2a43edfe766b/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1793.816314] env[61439]: DEBUG 
nova.virt.vmwareapi.vmops [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to [datastore2] vmware_temp/924d4a51-1787-4267-950f-2a43edfe766b/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1793.817034] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93e6483a-87fd-4440-bbcc-b6c1e3e71a8c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1793.823775] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e0be3dc-af3c-4668-a875-0bbcaffc09be {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1793.832501] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb2118c3-48da-4223-ab2c-82ac9cc99116 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1793.862477] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b81bd950-ca99-45e3-a647-00f61fc21c94 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1793.868597] env[61439]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-53c8cab7-7a50-4010-9b4f-c2314f60b966 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1793.888325] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 
tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1793.932960] env[61439]: DEBUG nova.compute.manager [req-610d9490-3c14-4ad6-a174-cc076d91d3f7 req-4f57e7c5-9e07-4eed-9706-90257ed5a53c service nova] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Received event network-changed-b0101f31-74be-43ab-9744-a2abb049b4c4 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1793.933162] env[61439]: DEBUG nova.compute.manager [req-610d9490-3c14-4ad6-a174-cc076d91d3f7 req-4f57e7c5-9e07-4eed-9706-90257ed5a53c service nova] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Refreshing instance network info cache due to event network-changed-b0101f31-74be-43ab-9744-a2abb049b4c4. {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1793.933382] env[61439]: DEBUG oslo_concurrency.lockutils [req-610d9490-3c14-4ad6-a174-cc076d91d3f7 req-4f57e7c5-9e07-4eed-9706-90257ed5a53c service nova] Acquiring lock "refresh_cache-d25ad8a1-1f52-4e6b-93a5-629420a06e66" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1793.933524] env[61439]: DEBUG oslo_concurrency.lockutils [req-610d9490-3c14-4ad6-a174-cc076d91d3f7 req-4f57e7c5-9e07-4eed-9706-90257ed5a53c service nova] Acquired lock "refresh_cache-d25ad8a1-1f52-4e6b-93a5-629420a06e66" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1793.933681] env[61439]: DEBUG nova.network.neutron [req-610d9490-3c14-4ad6-a174-cc076d91d3f7 req-4f57e7c5-9e07-4eed-9706-90257ed5a53c service nova] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Refreshing network info cache for port b0101f31-74be-43ab-9744-a2abb049b4c4 {{(pid=61439) 
_get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1793.938341] env[61439]: DEBUG oslo_vmware.rw_handles [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/924d4a51-1787-4267-950f-2a43edfe766b/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1793.997303] env[61439]: DEBUG oslo_vmware.rw_handles [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Completed reading data from the image iterator. {{(pid=61439) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1793.997486] env[61439]: DEBUG oslo_vmware.rw_handles [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/924d4a51-1787-4267-950f-2a43edfe766b/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1794.216974] env[61439]: DEBUG nova.network.neutron [req-610d9490-3c14-4ad6-a174-cc076d91d3f7 req-4f57e7c5-9e07-4eed-9706-90257ed5a53c service nova] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Updated VIF entry in instance network info cache for port b0101f31-74be-43ab-9744-a2abb049b4c4. 
{{(pid=61439) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1794.217344] env[61439]: DEBUG nova.network.neutron [req-610d9490-3c14-4ad6-a174-cc076d91d3f7 req-4f57e7c5-9e07-4eed-9706-90257ed5a53c service nova] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Updating instance_info_cache with network_info: [{"id": "b0101f31-74be-43ab-9744-a2abb049b4c4", "address": "fa:16:3e:d6:ca:e1", "network": {"id": "ae0ba33d-286b-46d2-b7e5-caea99f81aea", "bridge": "br-int", "label": "tempest-ServersTestJSON-1142892958-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e839303682f748a4b5a42c8a9273e388", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "779b8e65-8b9e-427e-af08-910febd65bfa", "external-id": "nsx-vlan-transportzone-906", "segmentation_id": 906, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb0101f31-74", "ovs_interfaceid": "b0101f31-74be-43ab-9744-a2abb049b4c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1794.229278] env[61439]: DEBUG oslo_concurrency.lockutils [req-610d9490-3c14-4ad6-a174-cc076d91d3f7 req-4f57e7c5-9e07-4eed-9706-90257ed5a53c service nova] Releasing lock "refresh_cache-d25ad8a1-1f52-4e6b-93a5-629420a06e66" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1797.203262] env[61439]: DEBUG oslo_service.periodic_task 
[None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1801.214232] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1801.214618] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Cleaning up deleted instances with incomplete migration {{(pid=61439) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 1805.211944] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1805.212332] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager.update_available_resource {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1805.223273] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1805.223487] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s 
{{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1805.223653] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1805.223814] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=61439) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1805.224888] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ec3ecc1-cfd3-436c-82aa-f27f158c0ccc {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1805.233502] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1f4b95c-74fd-40db-9e3b-4a945e3d5144 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1805.246839] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b2453e0-fc85-4800-9e6a-a4faa75fabbd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1805.252819] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccb3ebdf-b265-4dcf-8fdf-2e5adad9f4ea {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1805.281900] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] 
Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181565MB free_disk=35GB free_vcpus=48 pci_devices=None {{(pid=61439) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1805.282057] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1805.282250] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1805.321458] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance d25ad8a1-1f52-4e6b-93a5-629420a06e66 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1805.321653] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1805.321799] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1805.347315] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2cb67679-b70b-43ca-9066-e96bdc7cd699 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1805.354808] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a482dbff-5e42-47df-81ba-55e5f2c5b3e1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1805.383796] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08916f0a-1b1c-4397-89a0-6e93614c84fc {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1805.390668] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc719464-65e4-4880-9dbb-52d0c00c32b3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1805.403268] env[61439]: DEBUG nova.compute.provider_tree [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not 
changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1805.411011] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1805.422864] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=61439) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1805.423066] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.141s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1806.413122] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1806.413505] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=61439) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1806.413558] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=61439) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1807.201521] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1807.201702] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Starting heal instance info cache {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1807.201830] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Rebuilding the list of instances to heal {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1807.211416] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1807.211568] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Didn't find any instances for network info cache update. 
{{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1807.211793] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1807.211930] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Cleaning up deleted instances {{(pid=61439) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 1807.219629] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] There are 0 instances to clean {{(pid=61439) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1808.210497] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1808.210871] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1810.201600] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1810.201978] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=61439) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1814.198795] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1830.246056] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._sync_power_states {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1830.256563] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Getting list of instances from cluster (obj){ [ 1830.256563] env[61439]: value = "domain-c8" [ 1830.256563] env[61439]: _type = "ClusterComputeResource" [ 1830.256563] env[61439]: } {{(pid=61439) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1830.257981] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-344c55ed-0e3d-4666-8631-6ae1fc83108e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1830.267571] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Got total of 1 instances {{(pid=61439) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1830.267734] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Triggering sync for uuid d25ad8a1-1f52-4e6b-93a5-629420a06e66 {{(pid=61439) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1830.268066] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "d25ad8a1-1f52-4e6b-93a5-629420a06e66" by 
"nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1842.895836] env[61439]: WARNING oslo_vmware.rw_handles [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1842.895836] env[61439]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1842.895836] env[61439]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1842.895836] env[61439]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1842.895836] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1842.895836] env[61439]: ERROR oslo_vmware.rw_handles response.begin() [ 1842.895836] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1842.895836] env[61439]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1842.895836] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1842.895836] env[61439]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1842.895836] env[61439]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1842.895836] env[61439]: ERROR oslo_vmware.rw_handles [ 1842.896641] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Downloaded image file data 
a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to vmware_temp/924d4a51-1787-4267-950f-2a43edfe766b/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1842.898283] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Caching image {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1842.898510] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Copying Virtual Disk [datastore2] vmware_temp/924d4a51-1787-4267-950f-2a43edfe766b/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk to [datastore2] vmware_temp/924d4a51-1787-4267-950f-2a43edfe766b/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk {{(pid=61439) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1842.898786] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-33c4bff8-a9b9-4260-9eb3-8be7ef5c8bf7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1842.906410] env[61439]: DEBUG oslo_vmware.api [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for the task: (returnval){ [ 1842.906410] env[61439]: value = "task-987786" [ 1842.906410] env[61439]: _type = "Task" [ 1842.906410] env[61439]: } to complete. 
{{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1842.914226] env[61439]: DEBUG oslo_vmware.api [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': task-987786, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1843.417659] env[61439]: DEBUG oslo_vmware.exceptions [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Fault InvalidArgument not matched. {{(pid=61439) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1843.417950] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1843.418486] env[61439]: ERROR nova.compute.manager [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1843.418486] env[61439]: Faults: ['InvalidArgument'] [ 1843.418486] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Traceback (most recent call last): [ 1843.418486] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1843.418486] env[61439]: ERROR nova.compute.manager 
[instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] yield resources [ 1843.418486] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1843.418486] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] self.driver.spawn(context, instance, image_meta, [ 1843.418486] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1843.418486] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1843.418486] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1843.418486] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] self._fetch_image_if_missing(context, vi) [ 1843.418486] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1843.418486] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] image_cache(vi, tmp_image_ds_loc) [ 1843.418486] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1843.418486] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] vm_util.copy_virtual_disk( [ 1843.418486] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1843.418486] env[61439]: ERROR nova.compute.manager [instance: 
d25ad8a1-1f52-4e6b-93a5-629420a06e66] session._wait_for_task(vmdk_copy_task) [ 1843.418486] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1843.418486] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] return self.wait_for_task(task_ref) [ 1843.418486] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1843.418486] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] return evt.wait() [ 1843.418486] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1843.418486] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] result = hub.switch() [ 1843.418486] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1843.418486] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] return self.greenlet.switch() [ 1843.418486] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1843.418486] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] self.f(*self.args, **self.kw) [ 1843.418486] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1843.418486] env[61439]: ERROR nova.compute.manager 
[instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] raise exceptions.translate_fault(task_info.error) [ 1843.418486] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1843.418486] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Faults: ['InvalidArgument'] [ 1843.418486] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] [ 1843.419675] env[61439]: INFO nova.compute.manager [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Terminating instance [ 1843.421708] env[61439]: DEBUG nova.compute.manager [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Start destroying the instance on the hypervisor. 
{{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1843.421910] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1843.422668] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-684b6837-d032-4afe-a625-d3ebeea8d427 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1843.429255] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Unregistering the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1843.429463] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9f460523-e3d1-4294-8df7-f162e738a85d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1843.502280] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Unregistered the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1843.502507] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Deleting contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1843.502688] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Deleting the datastore file [datastore2] d25ad8a1-1f52-4e6b-93a5-629420a06e66 {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1843.503038] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8d1fe57a-563d-419d-8d78-fa8872c0d922 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1843.508710] env[61439]: DEBUG oslo_vmware.api [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for the task: (returnval){ [ 1843.508710] env[61439]: value = "task-987788" [ 1843.508710] env[61439]: _type = "Task" [ 1843.508710] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1843.515992] env[61439]: DEBUG oslo_vmware.api [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': task-987788, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1844.018824] env[61439]: DEBUG oslo_vmware.api [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': task-987788, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074009} completed successfully. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1844.019166] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Deleted the datastore file {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1844.019258] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Deleted contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1844.019408] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1844.019582] env[61439]: INFO nova.compute.manager [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1844.021664] env[61439]: DEBUG nova.compute.claims [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1844.021840] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1844.022081] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1844.088986] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57b1d8fe-6fd2-47d8-8e5b-456a069b00ef {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1844.096074] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55964cc2-cdb8-4bc6-8a27-d0e06604e0e2 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1844.125877] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21955f80-48a0-42fb-b017-bf6ce49f49d7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1844.132525] env[61439]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e4487be-11a2-4520-a162-b4b585e38f60 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1844.144948] env[61439]: DEBUG nova.compute.provider_tree [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1844.153211] env[61439]: DEBUG nova.scheduler.client.report [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1844.165580] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.143s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1844.166119] env[61439]: ERROR nova.compute.manager [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] 
Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1844.166119] env[61439]: Faults: ['InvalidArgument'] [ 1844.166119] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Traceback (most recent call last): [ 1844.166119] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1844.166119] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] self.driver.spawn(context, instance, image_meta, [ 1844.166119] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1844.166119] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1844.166119] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1844.166119] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] self._fetch_image_if_missing(context, vi) [ 1844.166119] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1844.166119] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] image_cache(vi, tmp_image_ds_loc) [ 1844.166119] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1844.166119] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] vm_util.copy_virtual_disk( [ 1844.166119] env[61439]: 
ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1844.166119] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] session._wait_for_task(vmdk_copy_task) [ 1844.166119] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1844.166119] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] return self.wait_for_task(task_ref) [ 1844.166119] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1844.166119] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] return evt.wait() [ 1844.166119] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1844.166119] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] result = hub.switch() [ 1844.166119] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1844.166119] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] return self.greenlet.switch() [ 1844.166119] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1844.166119] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] self.f(*self.args, **self.kw) [ 1844.166119] env[61439]: ERROR 
nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1844.166119] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] raise exceptions.translate_fault(task_info.error) [ 1844.166119] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1844.166119] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Faults: ['InvalidArgument'] [ 1844.166119] env[61439]: ERROR nova.compute.manager [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] [ 1844.167611] env[61439]: DEBUG nova.compute.utils [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] VimFaultException {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1844.168099] env[61439]: DEBUG nova.compute.manager [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Build of instance d25ad8a1-1f52-4e6b-93a5-629420a06e66 was re-scheduled: A specified parameter was not correct: fileType [ 1844.168099] env[61439]: Faults: ['InvalidArgument'] {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1844.168481] env[61439]: DEBUG nova.compute.manager [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1844.168653] env[61439]: DEBUG nova.compute.manager [None 
req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1844.168826] env[61439]: DEBUG nova.compute.manager [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1844.169006] env[61439]: DEBUG nova.network.neutron [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1844.448538] env[61439]: DEBUG nova.network.neutron [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1844.460080] env[61439]: INFO nova.compute.manager [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] Took 0.29 seconds to deallocate network for instance. 
[ 1844.550142] env[61439]: INFO nova.scheduler.client.report [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Deleted allocations for instance d25ad8a1-1f52-4e6b-93a5-629420a06e66 [ 1844.575799] env[61439]: DEBUG oslo_concurrency.lockutils [None req-4e09a1e0-3fc1-489c-bf85-295f8f61fc20 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "d25ad8a1-1f52-4e6b-93a5-629420a06e66" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 53.788s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1844.576066] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "d25ad8a1-1f52-4e6b-93a5-629420a06e66" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 14.308s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1844.576252] env[61439]: INFO nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: d25ad8a1-1f52-4e6b-93a5-629420a06e66] During sync_power_state the instance has a pending task (spawning). Skip. 
[ 1844.576425] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "d25ad8a1-1f52-4e6b-93a5-629420a06e66" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1845.664715] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "4cbfa0ff-1a92-4d02-9929-b05931142d19" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1845.665017] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "4cbfa0ff-1a92-4d02-9929-b05931142d19" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1845.674929] env[61439]: DEBUG nova.compute.manager [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1845.720941] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1845.721219] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1845.722571] env[61439]: INFO nova.compute.claims [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1845.790528] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d5dfe80-2998-4b78-b9e1-475c97dc9aa4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.798094] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-771859cb-8c2b-4e7d-831b-9fcda6d7cc48 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.827250] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8687a3f-03b8-400a-b4d8-beccba17f2a8 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.834145] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d7fb980-939a-4214-826f-e36baa166748 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.846782] env[61439]: DEBUG nova.compute.provider_tree [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1845.857721] env[61439]: DEBUG nova.scheduler.client.report [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1845.869711] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.148s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1845.870183] env[61439]: DEBUG nova.compute.manager [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 
tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1845.900791] env[61439]: DEBUG nova.compute.utils [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1845.902142] env[61439]: DEBUG nova.compute.manager [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1845.902318] env[61439]: DEBUG nova.network.neutron [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1845.911534] env[61439]: DEBUG nova.compute.manager [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Start building block device mappings for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1845.968080] env[61439]: DEBUG nova.policy [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cf4545757716483485ca9b60bd689a1c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e839303682f748a4b5a42c8a9273e388', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 1845.970995] env[61439]: DEBUG nova.compute.manager [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1845.994460] env[61439]: DEBUG nova.virt.hardware [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1845.994720] env[61439]: DEBUG nova.virt.hardware [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1845.994858] env[61439]: DEBUG nova.virt.hardware [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1845.995059] env[61439]: DEBUG nova.virt.hardware [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Flavor pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:388}} [ 1845.995239] env[61439]: DEBUG nova.virt.hardware [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1845.995393] env[61439]: DEBUG nova.virt.hardware [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1845.995603] env[61439]: DEBUG nova.virt.hardware [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1845.995761] env[61439]: DEBUG nova.virt.hardware [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1845.995936] env[61439]: DEBUG nova.virt.hardware [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1845.996110] env[61439]: DEBUG nova.virt.hardware [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:575}} [ 1845.996286] env[61439]: DEBUG nova.virt.hardware [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1845.997143] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a3ecbba-1517-4409-96b2-59fb8a540917 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.006918] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbc2dbe1-db89-4875-95bb-289f77905c16 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.273310] env[61439]: DEBUG nova.network.neutron [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Successfully created port: 470746a2-aa8e-4358-899e-db2856c6efa3 {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1846.829767] env[61439]: DEBUG nova.compute.manager [req-1e9c4d30-6b8e-4335-8f21-652b1926e15e req-f28c23fe-c67b-4973-8b5d-7b6eb691ff77 service nova] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Received event network-vif-plugged-470746a2-aa8e-4358-899e-db2856c6efa3 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1846.830345] env[61439]: DEBUG oslo_concurrency.lockutils [req-1e9c4d30-6b8e-4335-8f21-652b1926e15e req-f28c23fe-c67b-4973-8b5d-7b6eb691ff77 service nova] Acquiring lock "4cbfa0ff-1a92-4d02-9929-b05931142d19-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=61439) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1846.830345] env[61439]: DEBUG oslo_concurrency.lockutils [req-1e9c4d30-6b8e-4335-8f21-652b1926e15e req-f28c23fe-c67b-4973-8b5d-7b6eb691ff77 service nova] Lock "4cbfa0ff-1a92-4d02-9929-b05931142d19-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1846.830596] env[61439]: DEBUG oslo_concurrency.lockutils [req-1e9c4d30-6b8e-4335-8f21-652b1926e15e req-f28c23fe-c67b-4973-8b5d-7b6eb691ff77 service nova] Lock "4cbfa0ff-1a92-4d02-9929-b05931142d19-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1846.830596] env[61439]: DEBUG nova.compute.manager [req-1e9c4d30-6b8e-4335-8f21-652b1926e15e req-f28c23fe-c67b-4973-8b5d-7b6eb691ff77 service nova] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] No waiting events found dispatching network-vif-plugged-470746a2-aa8e-4358-899e-db2856c6efa3 {{(pid=61439) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1846.830771] env[61439]: WARNING nova.compute.manager [req-1e9c4d30-6b8e-4335-8f21-652b1926e15e req-f28c23fe-c67b-4973-8b5d-7b6eb691ff77 service nova] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Received unexpected event network-vif-plugged-470746a2-aa8e-4358-899e-db2856c6efa3 for instance with vm_state building and task_state spawning. 
[ 1846.905831] env[61439]: DEBUG nova.network.neutron [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Successfully updated port: 470746a2-aa8e-4358-899e-db2856c6efa3 {{(pid=61439) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1846.916715] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "refresh_cache-4cbfa0ff-1a92-4d02-9929-b05931142d19" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1846.917905] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquired lock "refresh_cache-4cbfa0ff-1a92-4d02-9929-b05931142d19" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1846.917905] env[61439]: DEBUG nova.network.neutron [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1846.978654] env[61439]: DEBUG nova.network.neutron [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1847.191594] env[61439]: DEBUG nova.network.neutron [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Updating instance_info_cache with network_info: [{"id": "470746a2-aa8e-4358-899e-db2856c6efa3", "address": "fa:16:3e:e2:7e:bb", "network": {"id": "ae0ba33d-286b-46d2-b7e5-caea99f81aea", "bridge": "br-int", "label": "tempest-ServersTestJSON-1142892958-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e839303682f748a4b5a42c8a9273e388", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "779b8e65-8b9e-427e-af08-910febd65bfa", "external-id": "nsx-vlan-transportzone-906", "segmentation_id": 906, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap470746a2-aa", "ovs_interfaceid": "470746a2-aa8e-4358-899e-db2856c6efa3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1847.202430] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Releasing lock "refresh_cache-4cbfa0ff-1a92-4d02-9929-b05931142d19" {{(pid=61439) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1847.202794] env[61439]: DEBUG nova.compute.manager [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Instance network_info: |[{"id": "470746a2-aa8e-4358-899e-db2856c6efa3", "address": "fa:16:3e:e2:7e:bb", "network": {"id": "ae0ba33d-286b-46d2-b7e5-caea99f81aea", "bridge": "br-int", "label": "tempest-ServersTestJSON-1142892958-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e839303682f748a4b5a42c8a9273e388", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "779b8e65-8b9e-427e-af08-910febd65bfa", "external-id": "nsx-vlan-transportzone-906", "segmentation_id": 906, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap470746a2-aa", "ovs_interfaceid": "470746a2-aa8e-4358-899e-db2856c6efa3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1847.203259] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e2:7e:bb', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 
'779b8e65-8b9e-427e-af08-910febd65bfa', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '470746a2-aa8e-4358-899e-db2856c6efa3', 'vif_model': 'vmxnet3'}] {{(pid=61439) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1847.210901] env[61439]: DEBUG oslo.service.loopingcall [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1847.211446] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Creating VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1847.211677] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e496ac29-3532-43d0-8dac-2736738b449c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1847.232542] env[61439]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1847.232542] env[61439]: value = "task-987789" [ 1847.232542] env[61439]: _type = "Task" [ 1847.232542] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1847.240998] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987789, 'name': CreateVM_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1847.743312] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987789, 'name': CreateVM_Task, 'duration_secs': 0.309868} completed successfully. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1847.743488] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Created VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1847.744157] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1847.744325] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1847.744674] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1847.744914] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-46dc17bc-e47a-4053-a09b-efe486885ac3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1847.749235] env[61439]: DEBUG oslo_vmware.api [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for the task: (returnval){ [ 
1847.749235] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]5220de85-1a9a-f54e-229e-3008c6362e4e" [ 1847.749235] env[61439]: _type = "Task" [ 1847.749235] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1847.756583] env[61439]: DEBUG oslo_vmware.api [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]5220de85-1a9a-f54e-229e-3008c6362e4e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1848.260363] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1848.260827] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Processing image a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1848.260901] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1848.261030] env[61439]: DEBUG oslo_concurrency.lockutils [None 
req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1848.261196] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1848.261447] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-48460841-7818-4c4b-a5a5-ff5be9572e92 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1848.277743] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1848.277934] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=61439) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1848.278600] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0957a9fb-83df-4fe6-a0a8-3983d947393e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1848.283400] env[61439]: DEBUG oslo_vmware.api [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for the task: (returnval){ [ 1848.283400] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]5264c5fd-0010-4919-422a-7fe011a3bc8c" [ 1848.283400] env[61439]: _type = "Task" [ 1848.283400] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1848.290303] env[61439]: DEBUG oslo_vmware.api [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]5264c5fd-0010-4919-422a-7fe011a3bc8c, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1848.793703] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Preparing fetch location {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1848.793963] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Creating directory with path [datastore2] vmware_temp/cf3649bf-e7b2-4b8e-825f-c82ba866d7ce/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1848.794211] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-877e0f07-dfa4-49c7-bf44-3869a4453274 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1848.813469] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Created directory with path [datastore2] vmware_temp/cf3649bf-e7b2-4b8e-825f-c82ba866d7ce/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1848.813684] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Fetch image to [datastore2] vmware_temp/cf3649bf-e7b2-4b8e-825f-c82ba866d7ce/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1848.813821] env[61439]: DEBUG 
nova.virt.vmwareapi.vmops [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to [datastore2] vmware_temp/cf3649bf-e7b2-4b8e-825f-c82ba866d7ce/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1848.814593] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-468b96fe-43b9-40bd-994a-492a2d6894bd {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1848.822537] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78142179-607a-4ac2-8d56-54c9ad28fca1 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1848.831382] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df23159f-ff37-4e24-964e-74106d923085 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1848.862486] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25c85356-5cb4-4f3d-94f4-ef7b00a11703 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1848.866034] env[61439]: DEBUG nova.compute.manager [req-694b7a60-890b-483a-a8c2-0b9ccdb0121e req-f270c760-de55-4ff9-877e-6450c6951180 service nova] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Received event network-changed-470746a2-aa8e-4358-899e-db2856c6efa3 {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1848.866145] env[61439]: DEBUG 
nova.compute.manager [req-694b7a60-890b-483a-a8c2-0b9ccdb0121e req-f270c760-de55-4ff9-877e-6450c6951180 service nova] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Refreshing instance network info cache due to event network-changed-470746a2-aa8e-4358-899e-db2856c6efa3. {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1848.866360] env[61439]: DEBUG oslo_concurrency.lockutils [req-694b7a60-890b-483a-a8c2-0b9ccdb0121e req-f270c760-de55-4ff9-877e-6450c6951180 service nova] Acquiring lock "refresh_cache-4cbfa0ff-1a92-4d02-9929-b05931142d19" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1848.866502] env[61439]: DEBUG oslo_concurrency.lockutils [req-694b7a60-890b-483a-a8c2-0b9ccdb0121e req-f270c760-de55-4ff9-877e-6450c6951180 service nova] Acquired lock "refresh_cache-4cbfa0ff-1a92-4d02-9929-b05931142d19" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1848.866679] env[61439]: DEBUG nova.network.neutron [req-694b7a60-890b-483a-a8c2-0b9ccdb0121e req-f270c760-de55-4ff9-877e-6450c6951180 service nova] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Refreshing network info cache for port 470746a2-aa8e-4358-899e-db2856c6efa3 {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1848.873023] env[61439]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-87e04996-00df-4021-8213-ceca6e1da6c6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1848.900683] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to the data store datastore2 
{{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1848.951357] env[61439]: DEBUG oslo_vmware.rw_handles [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cf3649bf-e7b2-4b8e-825f-c82ba866d7ce/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1849.011320] env[61439]: DEBUG oslo_vmware.rw_handles [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Completed reading data from the image iterator. {{(pid=61439) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1849.011516] env[61439]: DEBUG oslo_vmware.rw_handles [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cf3649bf-e7b2-4b8e-825f-c82ba866d7ce/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1849.142931] env[61439]: DEBUG nova.network.neutron [req-694b7a60-890b-483a-a8c2-0b9ccdb0121e req-f270c760-de55-4ff9-877e-6450c6951180 service nova] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Updated VIF entry in instance network info cache for port 470746a2-aa8e-4358-899e-db2856c6efa3. 
{{(pid=61439) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1849.143299] env[61439]: DEBUG nova.network.neutron [req-694b7a60-890b-483a-a8c2-0b9ccdb0121e req-f270c760-de55-4ff9-877e-6450c6951180 service nova] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Updating instance_info_cache with network_info: [{"id": "470746a2-aa8e-4358-899e-db2856c6efa3", "address": "fa:16:3e:e2:7e:bb", "network": {"id": "ae0ba33d-286b-46d2-b7e5-caea99f81aea", "bridge": "br-int", "label": "tempest-ServersTestJSON-1142892958-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e839303682f748a4b5a42c8a9273e388", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "779b8e65-8b9e-427e-af08-910febd65bfa", "external-id": "nsx-vlan-transportzone-906", "segmentation_id": 906, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap470746a2-aa", "ovs_interfaceid": "470746a2-aa8e-4358-899e-db2856c6efa3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1849.152452] env[61439]: DEBUG oslo_concurrency.lockutils [req-694b7a60-890b-483a-a8c2-0b9ccdb0121e req-f270c760-de55-4ff9-877e-6450c6951180 service nova] Releasing lock "refresh_cache-4cbfa0ff-1a92-4d02-9929-b05931142d19" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1865.202301] env[61439]: DEBUG oslo_service.periodic_task 
[None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1865.202840] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager.update_available_resource {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1865.213734] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1865.213938] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1865.214125] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1865.214293] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=61439) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1865.215411] env[61439]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aafc6428-906b-48ec-97ce-6483da37ab79 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1865.225086] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c101409-d9fe-4cdd-ab42-dc9be8f0a748 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1865.238529] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b56c2a9-d1cd-43e3-88c3-0fade5df015a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1865.244577] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90bc654f-9110-4693-9305-0d7c377d643e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1865.273039] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181567MB free_disk=35GB free_vcpus=48 pci_devices=None {{(pid=61439) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1865.273188] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1865.273383] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=61439) 
inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1865.358321] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 4cbfa0ff-1a92-4d02-9929-b05931142d19 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1865.358553] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1865.358706] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1865.374621] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Refreshing inventories for resource provider b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1865.388023] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Updating ProviderTree inventory for provider b35c9fce-988b-4acc-b175-83b202107c41 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 
'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1865.388225] env[61439]: DEBUG nova.compute.provider_tree [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Updating inventory in ProviderTree for provider b35c9fce-988b-4acc-b175-83b202107c41 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1865.399081] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Refreshing aggregate associations for resource provider b35c9fce-988b-4acc-b175-83b202107c41, aggregates: None {{(pid=61439) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1865.417183] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Refreshing trait associations for resource provider b35c9fce-988b-4acc-b175-83b202107c41, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NODE {{(pid=61439) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1865.442703] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2352ed18-c53d-4519-8987-d8e4739bdc46 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1865.449960] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6c2871a-c16f-448d-ba29-fd1eec89b586 
{{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1865.480268] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3e1ce19-5bb4-44ce-b36a-495eac56eef6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1865.487348] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d5246e6-ea23-4e6c-a827-ab63ad950325 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1865.499771] env[61439]: DEBUG nova.compute.provider_tree [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1865.507506] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1865.520791] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=61439) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1865.520791] env[61439]: DEBUG oslo_concurrency.lockutils [None 
req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.247s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1867.519967] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1867.519967] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=61439) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1868.197803] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1868.201392] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1869.202186] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1869.202548] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Starting heal instance info cache {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1869.202548] env[61439]: DEBUG 
nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Rebuilding the list of instances to heal {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1869.212943] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1869.213117] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Didn't find any instances for network info cache update. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1870.202235] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1870.202643] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1872.202134] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1897.914719] env[61439]: WARNING oslo_vmware.rw_handles [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1897.914719] 
env[61439]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1897.914719] env[61439]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1897.914719] env[61439]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1897.914719] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1897.914719] env[61439]: ERROR oslo_vmware.rw_handles response.begin() [ 1897.914719] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1897.914719] env[61439]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1897.914719] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1897.914719] env[61439]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1897.914719] env[61439]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1897.914719] env[61439]: ERROR oslo_vmware.rw_handles [ 1897.915524] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Downloaded image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to vmware_temp/cf3649bf-e7b2-4b8e-825f-c82ba866d7ce/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1897.917245] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Caching image {{(pid=61439) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1897.917500] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Copying Virtual Disk [datastore2] vmware_temp/cf3649bf-e7b2-4b8e-825f-c82ba866d7ce/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk to [datastore2] vmware_temp/cf3649bf-e7b2-4b8e-825f-c82ba866d7ce/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk {{(pid=61439) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1897.917787] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-57c245fa-6659-419a-aab8-adc67dcb08b0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1897.925536] env[61439]: DEBUG oslo_vmware.api [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for the task: (returnval){ [ 1897.925536] env[61439]: value = "task-987790" [ 1897.925536] env[61439]: _type = "Task" [ 1897.925536] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1897.933335] env[61439]: DEBUG oslo_vmware.api [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': task-987790, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1898.435385] env[61439]: DEBUG oslo_vmware.exceptions [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Fault InvalidArgument not matched. 
{{(pid=61439) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1898.435698] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1898.436261] env[61439]: ERROR nova.compute.manager [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1898.436261] env[61439]: Faults: ['InvalidArgument'] [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Traceback (most recent call last): [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] yield resources [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] self.driver.spawn(context, instance, image_meta, [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 
4cbfa0ff-1a92-4d02-9929-b05931142d19] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] self._fetch_image_if_missing(context, vi) [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] image_cache(vi, tmp_image_ds_loc) [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] vm_util.copy_virtual_disk( [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] session._wait_for_task(vmdk_copy_task) [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] return self.wait_for_task(task_ref) [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1898.436261] env[61439]: ERROR nova.compute.manager 
[instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] return evt.wait() [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] result = hub.switch() [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] return self.greenlet.switch() [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] self.f(*self.args, **self.kw) [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] raise exceptions.translate_fault(task_info.error) [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Faults: ['InvalidArgument'] [ 1898.436261] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] [ 1898.437339] env[61439]: INFO nova.compute.manager [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 
tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Terminating instance [ 1898.439499] env[61439]: DEBUG nova.compute.manager [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1898.439687] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1898.440431] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-221ac9e7-cc94-456f-bb96-5d2425e4d867 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1898.446756] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Unregistering the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1898.446974] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3a5cc8dc-c952-4c5a-ad60-b5a5462c871e {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1898.510886] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Unregistered the VM {{(pid=61439) _destroy_instance 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1898.511130] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Deleting contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1898.511338] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Deleting the datastore file [datastore2] 4cbfa0ff-1a92-4d02-9929-b05931142d19 {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1898.511603] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2a7b1606-2a0d-4059-ae31-f5732b28bb91 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1898.518080] env[61439]: DEBUG oslo_vmware.api [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Waiting for the task: (returnval){ [ 1898.518080] env[61439]: value = "task-987792" [ 1898.518080] env[61439]: _type = "Task" [ 1898.518080] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1898.525617] env[61439]: DEBUG oslo_vmware.api [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': task-987792, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1899.027513] env[61439]: DEBUG oslo_vmware.api [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Task: {'id': task-987792, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066083} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1899.027865] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Deleted the datastore file {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1899.028014] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Deleted contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1899.028160] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1899.028332] env[61439]: INFO nova.compute.manager [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Took 0.59 seconds to destroy the instance on the hypervisor. 
[ 1899.030355] env[61439]: DEBUG nova.compute.claims [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1899.030563] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1899.030757] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1899.096941] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b735445-0622-4806-9da2-d0787148af14 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1899.105049] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e9aa8ce-fecb-4c96-aadb-463fefde8984 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1899.133509] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-471c3fb0-2a81-4327-926b-b4c95e2932dc {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1899.140106] env[61439]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dad5ec22-c4c8-48d7-aa1b-fd31227e8565 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1899.152469] env[61439]: DEBUG nova.compute.provider_tree [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1899.160285] env[61439]: DEBUG nova.scheduler.client.report [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1899.171869] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.141s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1899.172380] env[61439]: ERROR nova.compute.manager [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] 
Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1899.172380] env[61439]: Faults: ['InvalidArgument'] [ 1899.172380] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Traceback (most recent call last): [ 1899.172380] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1899.172380] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] self.driver.spawn(context, instance, image_meta, [ 1899.172380] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1899.172380] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1899.172380] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1899.172380] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] self._fetch_image_if_missing(context, vi) [ 1899.172380] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1899.172380] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] image_cache(vi, tmp_image_ds_loc) [ 1899.172380] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1899.172380] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] vm_util.copy_virtual_disk( [ 1899.172380] env[61439]: 
ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1899.172380] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] session._wait_for_task(vmdk_copy_task) [ 1899.172380] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1899.172380] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] return self.wait_for_task(task_ref) [ 1899.172380] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1899.172380] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] return evt.wait() [ 1899.172380] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1899.172380] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] result = hub.switch() [ 1899.172380] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1899.172380] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] return self.greenlet.switch() [ 1899.172380] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1899.172380] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] self.f(*self.args, **self.kw) [ 1899.172380] env[61439]: ERROR 
nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1899.172380] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] raise exceptions.translate_fault(task_info.error) [ 1899.172380] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1899.172380] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Faults: ['InvalidArgument'] [ 1899.172380] env[61439]: ERROR nova.compute.manager [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] [ 1899.173633] env[61439]: DEBUG nova.compute.utils [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] VimFaultException {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1899.174461] env[61439]: DEBUG nova.compute.manager [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Build of instance 4cbfa0ff-1a92-4d02-9929-b05931142d19 was re-scheduled: A specified parameter was not correct: fileType [ 1899.174461] env[61439]: Faults: ['InvalidArgument'] {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1899.174836] env[61439]: DEBUG nova.compute.manager [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1899.175016] env[61439]: DEBUG nova.compute.manager [None 
req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1899.175200] env[61439]: DEBUG nova.compute.manager [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1899.175363] env[61439]: DEBUG nova.network.neutron [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1899.458512] env[61439]: DEBUG nova.network.neutron [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1899.469290] env[61439]: INFO nova.compute.manager [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] [instance: 4cbfa0ff-1a92-4d02-9929-b05931142d19] Took 0.29 seconds to deallocate network for instance. 
[ 1899.559584] env[61439]: INFO nova.scheduler.client.report [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Deleted allocations for instance 4cbfa0ff-1a92-4d02-9929-b05931142d19 [ 1899.578394] env[61439]: DEBUG oslo_concurrency.lockutils [None req-d4896d1a-4e63-4ea3-a319-ae9ed9e0a795 tempest-ServersTestJSON-818634346 tempest-ServersTestJSON-818634346-project-member] Lock "4cbfa0ff-1a92-4d02-9929-b05931142d19" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 53.913s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1905.955714] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Acquiring lock "8eeb384b-b799-4b21-894f-c265e7e1b25e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1905.956423] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Lock "8eeb384b-b799-4b21-894f-c265e7e1b25e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1905.968427] env[61439]: DEBUG nova.compute.manager [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1906.021400] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1906.021658] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1906.023213] env[61439]: INFO nova.compute.claims [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1906.096070] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2cb1babb-d3c2-4772-a8a2-babbc4bb89f6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1906.104907] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87dc8733-1f78-47b7-bf40-3a3c4b941b44 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1906.135194] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34ecc361-c9e5-40ed-9c5b-45ea866c8874 {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1906.142956] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7832d58-55ff-4ab6-a521-bcea7076f34a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1906.156696] env[61439]: DEBUG nova.compute.provider_tree [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1906.167271] env[61439]: DEBUG nova.scheduler.client.report [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1906.181767] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.160s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1906.182296] env[61439]: DEBUG nova.compute.manager [None 
req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1906.222109] env[61439]: DEBUG nova.compute.utils [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1906.224283] env[61439]: DEBUG nova.compute.manager [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1906.224610] env[61439]: DEBUG nova.network.neutron [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1906.235261] env[61439]: DEBUG nova.compute.manager [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Start building block device mappings for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1906.283516] env[61439]: DEBUG nova.policy [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '07c678a871574174b2bfefac2f989bb9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '92ed23cb43494196a6a832149a981bb0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 1906.304197] env[61439]: DEBUG nova.compute.manager [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1906.332171] env[61439]: DEBUG nova.virt.hardware [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1906.332424] env[61439]: DEBUG nova.virt.hardware [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1906.332584] env[61439]: DEBUG nova.virt.hardware [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1906.332832] env[61439]: DEBUG nova.virt.hardware [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Flavor pref 0:0:0 {{(pid=61439) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1906.333009] env[61439]: DEBUG nova.virt.hardware [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1906.333166] env[61439]: DEBUG nova.virt.hardware [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1906.333381] env[61439]: DEBUG nova.virt.hardware [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1906.333542] env[61439]: DEBUG nova.virt.hardware [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1906.333710] env[61439]: DEBUG nova.virt.hardware [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1906.333892] env[61439]: DEBUG nova.virt.hardware [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1906.334107] env[61439]: DEBUG nova.virt.hardware [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1906.335214] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c79ba584-3a83-4699-a076-7dd618929f8d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1906.344546] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73b36c40-32ea-4196-ae9b-02823f0bd8d0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1906.631614] env[61439]: DEBUG nova.network.neutron [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Successfully created port: a30d0236-0277-46fd-bd4b-2192afe0b7be {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1907.186321] env[61439]: DEBUG nova.compute.manager [req-12e73315-e807-497b-ac1e-28726d1b73ed req-25566b8f-78fb-462c-b191-90a4f500cb99 service nova] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Received event network-vif-plugged-a30d0236-0277-46fd-bd4b-2192afe0b7be {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1907.186580] env[61439]: DEBUG oslo_concurrency.lockutils [req-12e73315-e807-497b-ac1e-28726d1b73ed req-25566b8f-78fb-462c-b191-90a4f500cb99 service nova] Acquiring lock 
"8eeb384b-b799-4b21-894f-c265e7e1b25e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1907.186755] env[61439]: DEBUG oslo_concurrency.lockutils [req-12e73315-e807-497b-ac1e-28726d1b73ed req-25566b8f-78fb-462c-b191-90a4f500cb99 service nova] Lock "8eeb384b-b799-4b21-894f-c265e7e1b25e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1907.186928] env[61439]: DEBUG oslo_concurrency.lockutils [req-12e73315-e807-497b-ac1e-28726d1b73ed req-25566b8f-78fb-462c-b191-90a4f500cb99 service nova] Lock "8eeb384b-b799-4b21-894f-c265e7e1b25e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1907.187110] env[61439]: DEBUG nova.compute.manager [req-12e73315-e807-497b-ac1e-28726d1b73ed req-25566b8f-78fb-462c-b191-90a4f500cb99 service nova] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] No waiting events found dispatching network-vif-plugged-a30d0236-0277-46fd-bd4b-2192afe0b7be {{(pid=61439) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1907.187283] env[61439]: WARNING nova.compute.manager [req-12e73315-e807-497b-ac1e-28726d1b73ed req-25566b8f-78fb-462c-b191-90a4f500cb99 service nova] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Received unexpected event network-vif-plugged-a30d0236-0277-46fd-bd4b-2192afe0b7be for instance with vm_state building and task_state spawning. 
[ 1907.273702] env[61439]: DEBUG nova.network.neutron [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Successfully updated port: a30d0236-0277-46fd-bd4b-2192afe0b7be {{(pid=61439) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1907.287209] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Acquiring lock "refresh_cache-8eeb384b-b799-4b21-894f-c265e7e1b25e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1907.287358] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Acquired lock "refresh_cache-8eeb384b-b799-4b21-894f-c265e7e1b25e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1907.287511] env[61439]: DEBUG nova.network.neutron [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1907.325895] env[61439]: DEBUG nova.network.neutron [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1907.495642] env[61439]: DEBUG nova.network.neutron [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Updating instance_info_cache with network_info: [{"id": "a30d0236-0277-46fd-bd4b-2192afe0b7be", "address": "fa:16:3e:ff:d0:65", "network": {"id": "668b80a6-37be-4f9b-8200-f3ef1f09c515", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-346638025-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "92ed23cb43494196a6a832149a981bb0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2bf99f85-3a5c-47c6-a603-e215be6ab0bd", "external-id": "nsx-vlan-transportzone-855", "segmentation_id": 855, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa30d0236-02", "ovs_interfaceid": "a30d0236-0277-46fd-bd4b-2192afe0b7be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1907.506631] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Releasing lock "refresh_cache-8eeb384b-b799-4b21-894f-c265e7e1b25e" {{(pid=61439) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1907.506954] env[61439]: DEBUG nova.compute.manager [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Instance network_info: |[{"id": "a30d0236-0277-46fd-bd4b-2192afe0b7be", "address": "fa:16:3e:ff:d0:65", "network": {"id": "668b80a6-37be-4f9b-8200-f3ef1f09c515", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-346638025-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "92ed23cb43494196a6a832149a981bb0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2bf99f85-3a5c-47c6-a603-e215be6ab0bd", "external-id": "nsx-vlan-transportzone-855", "segmentation_id": 855, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa30d0236-02", "ovs_interfaceid": "a30d0236-0277-46fd-bd4b-2192afe0b7be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1907.507357] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ff:d0:65', 'network_ref': {'type': 'OpaqueNetwork', 
'network-id': '2bf99f85-3a5c-47c6-a603-e215be6ab0bd', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a30d0236-0277-46fd-bd4b-2192afe0b7be', 'vif_model': 'vmxnet3'}] {{(pid=61439) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1907.514946] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Creating folder: Project (92ed23cb43494196a6a832149a981bb0). Parent ref: group-v221281. {{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1907.515498] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-08f6b619-f765-4b24-8e35-c065cbe635a6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1907.528205] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Created folder: Project (92ed23cb43494196a6a832149a981bb0) in parent group-v221281. [ 1907.528265] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Creating folder: Instances. Parent ref: group-v221347. 
{{(pid=61439) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1907.528585] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-87c91ae9-f105-49be-8433-8f8e9280f381 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1907.538806] env[61439]: INFO nova.virt.vmwareapi.vm_util [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Created folder: Instances in parent group-v221347. [ 1907.539054] env[61439]: DEBUG oslo.service.loopingcall [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1907.539241] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Creating VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1907.539437] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-430e9dfc-9c21-4457-a673-3fa1b62216b4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1907.558306] env[61439]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1907.558306] env[61439]: value = "task-987795" [ 1907.558306] env[61439]: _type = "Task" [ 1907.558306] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1907.565958] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987795, 'name': CreateVM_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1908.068308] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987795, 'name': CreateVM_Task, 'duration_secs': 0.426058} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1908.068508] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Created VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1908.069223] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1908.069437] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1908.069814] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1908.070236] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9094578d-942d-4b90-8691-b633a084ea6f {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1908.074775] env[61439]: DEBUG oslo_vmware.api [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Waiting for the task: (returnval){ [ 1908.074775] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52019a20-650f-604c-595c-bd224a728d19" [ 1908.074775] env[61439]: _type = "Task" [ 1908.074775] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1908.082141] env[61439]: DEBUG oslo_vmware.api [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52019a20-650f-604c-595c-bd224a728d19, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1908.585308] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1908.585670] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Processing image a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1908.585792] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 
tempest-AttachVolumeTestJSON-1078261084-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1908.585944] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1908.586131] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1908.586362] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a64f1296-4164-45c9-998e-86b5c0da2aa3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1908.602528] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1908.602701] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=61439) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1908.603528] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e98ccb13-cab6-47e3-9bee-f7a79c1768e9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1908.608266] env[61439]: DEBUG oslo_vmware.api [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Waiting for the task: (returnval){ [ 1908.608266] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]528184e9-0987-923e-b513-e677c7da0976" [ 1908.608266] env[61439]: _type = "Task" [ 1908.608266] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1908.616780] env[61439]: DEBUG oslo_vmware.api [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]528184e9-0987-923e-b513-e677c7da0976, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1909.118707] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Preparing fetch location {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1909.118961] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Creating directory with path [datastore2] vmware_temp/e979146c-d1e1-44c7-89d0-5560ac17649f/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1909.119208] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-70ca9692-5b3c-48e1-9597-273160f93e0f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1909.138024] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Created directory with path [datastore2] vmware_temp/e979146c-d1e1-44c7-89d0-5560ac17649f/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1909.138219] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Fetch image to [datastore2] vmware_temp/e979146c-d1e1-44c7-89d0-5560ac17649f/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk {{(pid=61439) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1909.138388] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to [datastore2] vmware_temp/e979146c-d1e1-44c7-89d0-5560ac17649f/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1909.139099] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3dd550fe-5b1e-4ded-b810-66826a13afe3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1909.145678] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82df70d5-f177-400d-be49-06f180f56203 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1909.154284] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1227ca78-8c7c-4b8c-ac36-51da67c088e0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1909.183924] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b8bc147-c750-4bb1-80eb-440091e21349 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1909.189056] env[61439]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-29f8aa1a-bd6c-4b22-afce-bcbe301e5e48 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1909.211568] env[61439]: 
DEBUG nova.virt.vmwareapi.images [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1909.219065] env[61439]: DEBUG nova.compute.manager [req-1dfe8278-792a-48f2-96fd-d200995c6ddc req-0910c812-8a71-4126-b499-347475d29aea service nova] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Received event network-changed-a30d0236-0277-46fd-bd4b-2192afe0b7be {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1909.219284] env[61439]: DEBUG nova.compute.manager [req-1dfe8278-792a-48f2-96fd-d200995c6ddc req-0910c812-8a71-4126-b499-347475d29aea service nova] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Refreshing instance network info cache due to event network-changed-a30d0236-0277-46fd-bd4b-2192afe0b7be. 
{{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1909.219536] env[61439]: DEBUG oslo_concurrency.lockutils [req-1dfe8278-792a-48f2-96fd-d200995c6ddc req-0910c812-8a71-4126-b499-347475d29aea service nova] Acquiring lock "refresh_cache-8eeb384b-b799-4b21-894f-c265e7e1b25e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1909.219638] env[61439]: DEBUG oslo_concurrency.lockutils [req-1dfe8278-792a-48f2-96fd-d200995c6ddc req-0910c812-8a71-4126-b499-347475d29aea service nova] Acquired lock "refresh_cache-8eeb384b-b799-4b21-894f-c265e7e1b25e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1909.220109] env[61439]: DEBUG nova.network.neutron [req-1dfe8278-792a-48f2-96fd-d200995c6ddc req-0910c812-8a71-4126-b499-347475d29aea service nova] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Refreshing network info cache for port a30d0236-0277-46fd-bd4b-2192afe0b7be {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1909.260280] env[61439]: DEBUG oslo_vmware.rw_handles [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e979146c-d1e1-44c7-89d0-5560ac17649f/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1909.318506] env[61439]: DEBUG oslo_vmware.rw_handles [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Completed reading data from the image iterator. 
{{(pid=61439) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1909.318686] env[61439]: DEBUG oslo_vmware.rw_handles [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e979146c-d1e1-44c7-89d0-5560ac17649f/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1909.468063] env[61439]: DEBUG nova.network.neutron [req-1dfe8278-792a-48f2-96fd-d200995c6ddc req-0910c812-8a71-4126-b499-347475d29aea service nova] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Updated VIF entry in instance network info cache for port a30d0236-0277-46fd-bd4b-2192afe0b7be. {{(pid=61439) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1909.468415] env[61439]: DEBUG nova.network.neutron [req-1dfe8278-792a-48f2-96fd-d200995c6ddc req-0910c812-8a71-4126-b499-347475d29aea service nova] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Updating instance_info_cache with network_info: [{"id": "a30d0236-0277-46fd-bd4b-2192afe0b7be", "address": "fa:16:3e:ff:d0:65", "network": {"id": "668b80a6-37be-4f9b-8200-f3ef1f09c515", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-346638025-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "92ed23cb43494196a6a832149a981bb0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", 
"port_filter": true, "nsx-logical-switch-id": "2bf99f85-3a5c-47c6-a603-e215be6ab0bd", "external-id": "nsx-vlan-transportzone-855", "segmentation_id": 855, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa30d0236-02", "ovs_interfaceid": "a30d0236-0277-46fd-bd4b-2192afe0b7be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1909.477509] env[61439]: DEBUG oslo_concurrency.lockutils [req-1dfe8278-792a-48f2-96fd-d200995c6ddc req-0910c812-8a71-4126-b499-347475d29aea service nova] Releasing lock "refresh_cache-8eeb384b-b799-4b21-894f-c265e7e1b25e" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1925.202065] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1925.202473] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager.update_available_resource {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1925.213745] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1925.213972] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1925.214154] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1925.214316] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=61439) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1925.215423] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5933762-8813-4fbc-9b70-3903ae1920c6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1925.223898] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35a6be04-e8a7-4899-a741-5f8c39a65678 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1925.237482] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9b48855-1374-442f-9776-835c419ff904 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1925.243495] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26beb72e-ba28-48f9-ad82-1900660bfd7f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1925.271548] env[61439]: DEBUG 
nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181588MB free_disk=35GB free_vcpus=48 pci_devices=None {{(pid=61439) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1925.271683] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1925.271872] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1925.342242] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 8eeb384b-b799-4b21-894f-c265e7e1b25e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1925.342458] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1925.342611] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1925.375610] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73b9e37d-ffea-4fa4-ba5a-c88f87b22b0d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1925.383360] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83694662-975e-4be9-a9c5-c31c16f71e5f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1925.415472] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7ec1275-a6a9-4127-a2aa-26671bc3c213 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1925.422521] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0bb635b-4f2b-46d3-bfbd-494454d393ea {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1925.435137] env[61439]: DEBUG nova.compute.provider_tree [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not 
changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1925.443060] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1925.458325] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=61439) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1925.458466] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.187s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1928.458984] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1929.201687] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=61439) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1929.201920] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=61439) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1930.197207] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1931.201509] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1931.201811] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Starting heal instance info cache {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1931.201811] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Rebuilding the list of instances to heal {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1931.211384] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1931.211530] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Didn't find any instances for network info cache update. 
{{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1931.211735] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1932.201538] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1932.201881] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1936.198078] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1956.802268] env[61439]: WARNING oslo_vmware.rw_handles [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1956.802268] env[61439]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1956.802268] env[61439]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1956.802268] env[61439]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1956.802268] env[61439]: ERROR oslo_vmware.rw_handles File 
"/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1956.802268] env[61439]: ERROR oslo_vmware.rw_handles response.begin() [ 1956.802268] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1956.802268] env[61439]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1956.802268] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1956.802268] env[61439]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1956.802268] env[61439]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1956.802268] env[61439]: ERROR oslo_vmware.rw_handles [ 1956.803096] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Downloaded image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to vmware_temp/e979146c-d1e1-44c7-89d0-5560ac17649f/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1956.805116] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Caching image {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1956.805383] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Copying Virtual Disk [datastore2] 
vmware_temp/e979146c-d1e1-44c7-89d0-5560ac17649f/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk to [datastore2] vmware_temp/e979146c-d1e1-44c7-89d0-5560ac17649f/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk {{(pid=61439) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1956.805796] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-83546adf-1a56-496b-bc5a-6cf0cd5c0f6f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1956.813037] env[61439]: DEBUG oslo_vmware.api [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Waiting for the task: (returnval){ [ 1956.813037] env[61439]: value = "task-987796" [ 1956.813037] env[61439]: _type = "Task" [ 1956.813037] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1956.820811] env[61439]: DEBUG oslo_vmware.api [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Task: {'id': task-987796, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1957.323761] env[61439]: DEBUG oslo_vmware.exceptions [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Fault InvalidArgument not matched. 
{{(pid=61439) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1957.324079] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1957.324642] env[61439]: ERROR nova.compute.manager [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1957.324642] env[61439]: Faults: ['InvalidArgument'] [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Traceback (most recent call last): [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] yield resources [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] self.driver.spawn(context, instance, image_meta, [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 
8eeb384b-b799-4b21-894f-c265e7e1b25e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] self._fetch_image_if_missing(context, vi) [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] image_cache(vi, tmp_image_ds_loc) [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] vm_util.copy_virtual_disk( [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] session._wait_for_task(vmdk_copy_task) [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] return self.wait_for_task(task_ref) [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1957.324642] env[61439]: ERROR nova.compute.manager 
[instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] return evt.wait() [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] result = hub.switch() [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] return self.greenlet.switch() [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] self.f(*self.args, **self.kw) [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] raise exceptions.translate_fault(task_info.error) [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Faults: ['InvalidArgument'] [ 1957.324642] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] [ 1957.326046] env[61439]: INFO nova.compute.manager [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 
tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Terminating instance [ 1957.327951] env[61439]: DEBUG nova.compute.manager [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1957.328163] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1957.328896] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52c88b86-222a-4b56-8e7c-96bce9834879 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1957.336613] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Unregistering the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1957.336829] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-67e5d25e-01fb-47bb-9d1e-bd60105e0392 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1958.245686] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] 
Unregistered the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1958.245686] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Deleting contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1958.245686] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Deleting the datastore file [datastore2] 8eeb384b-b799-4b21-894f-c265e7e1b25e {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1958.246205] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e9bc3c1b-5008-43a2-9321-269b3d1c0eaa {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1958.251958] env[61439]: DEBUG oslo_vmware.api [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Waiting for the task: (returnval){ [ 1958.251958] env[61439]: value = "task-987798" [ 1958.251958] env[61439]: _type = "Task" [ 1958.251958] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1958.259286] env[61439]: DEBUG oslo_vmware.api [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Task: {'id': task-987798, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1958.761485] env[61439]: DEBUG oslo_vmware.api [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Task: {'id': task-987798, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.071278} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1958.761831] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Deleted the datastore file {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1958.762041] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Deleted contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1958.762223] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1958.762399] env[61439]: INFO nova.compute.manager [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Took 1.43 seconds to destroy the instance on the hypervisor. 
[ 1958.764454] env[61439]: DEBUG nova.compute.claims [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1958.764626] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1958.764844] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1958.827200] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a2328cb-c800-4b3a-a1b6-7792df2f9bcb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1958.834455] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb3a5cfe-02bb-4e1d-8c68-77cfdf849791 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1958.863461] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73f866c3-348c-4976-be60-0e811426bbae {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
1958.870324] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5dfc7073-f9f1-4f52-bb62-d610b954bd8a {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1958.883237] env[61439]: DEBUG nova.compute.provider_tree [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1958.890963] env[61439]: DEBUG nova.scheduler.client.report [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1958.902904] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.138s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1958.903433] env[61439]: ERROR nova.compute.manager [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 
tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1958.903433] env[61439]: Faults: ['InvalidArgument'] [ 1958.903433] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Traceback (most recent call last): [ 1958.903433] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1958.903433] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] self.driver.spawn(context, instance, image_meta, [ 1958.903433] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1958.903433] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1958.903433] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1958.903433] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] self._fetch_image_if_missing(context, vi) [ 1958.903433] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1958.903433] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] image_cache(vi, tmp_image_ds_loc) [ 1958.903433] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1958.903433] env[61439]: ERROR nova.compute.manager 
[instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] vm_util.copy_virtual_disk( [ 1958.903433] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1958.903433] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] session._wait_for_task(vmdk_copy_task) [ 1958.903433] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1958.903433] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] return self.wait_for_task(task_ref) [ 1958.903433] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1958.903433] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] return evt.wait() [ 1958.903433] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1958.903433] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] result = hub.switch() [ 1958.903433] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1958.903433] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] return self.greenlet.switch() [ 1958.903433] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1958.903433] env[61439]: ERROR nova.compute.manager [instance: 
8eeb384b-b799-4b21-894f-c265e7e1b25e] self.f(*self.args, **self.kw) [ 1958.903433] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1958.903433] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] raise exceptions.translate_fault(task_info.error) [ 1958.903433] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1958.903433] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Faults: ['InvalidArgument'] [ 1958.903433] env[61439]: ERROR nova.compute.manager [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] [ 1958.904572] env[61439]: DEBUG nova.compute.utils [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] VimFaultException {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1958.905515] env[61439]: DEBUG nova.compute.manager [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Build of instance 8eeb384b-b799-4b21-894f-c265e7e1b25e was re-scheduled: A specified parameter was not correct: fileType [ 1958.905515] env[61439]: Faults: ['InvalidArgument'] {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1958.905880] env[61439]: DEBUG nova.compute.manager [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Unplugging VIFs for instance {{(pid=61439) 
_cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1958.906069] env[61439]: DEBUG nova.compute.manager [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1958.906246] env[61439]: DEBUG nova.compute.manager [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1958.906424] env[61439]: DEBUG nova.network.neutron [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1959.201611] env[61439]: DEBUG nova.network.neutron [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1959.213280] env[61439]: INFO nova.compute.manager [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 8eeb384b-b799-4b21-894f-c265e7e1b25e] Took 0.31 seconds to deallocate network for instance. 
[ 1959.301269] env[61439]: INFO nova.scheduler.client.report [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Deleted allocations for instance 8eeb384b-b799-4b21-894f-c265e7e1b25e [ 1959.320151] env[61439]: DEBUG oslo_concurrency.lockutils [None req-c8789fcf-e984-4061-bffe-d4eac82fc2f4 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Lock "8eeb384b-b799-4b21-894f-c265e7e1b25e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 53.364s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1961.946517] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Acquiring lock "9cc4b1c2-2582-4881-9572-0f261464eac3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1961.946847] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Lock "9cc4b1c2-2582-4881-9572-0f261464eac3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1961.957746] env[61439]: DEBUG nova.compute.manager [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Starting instance... 
{{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1962.003037] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1962.003226] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1962.004708] env[61439]: INFO nova.compute.claims [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1962.071747] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb8491b6-2fc5-4686-9437-c6701a953e8f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1962.080018] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be737823-41cf-4f66-81e5-a867bb9761d5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1962.110025] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5912105-f4a7-4506-b4a1-2bb7af98e1de {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1962.118228] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f66e620a-8dbe-4c31-8954-8ecb04343087 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1962.786080] env[61439]: DEBUG nova.compute.provider_tree [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1962.794337] env[61439]: DEBUG nova.scheduler.client.report [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1962.806739] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.803s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1962.807247] env[61439]: DEBUG nova.compute.manager [None 
req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Start building networks asynchronously for instance. {{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1962.840105] env[61439]: DEBUG nova.compute.utils [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Using /dev/sd instead of None {{(pid=61439) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1962.840443] env[61439]: DEBUG nova.compute.manager [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Allocating IP information in the background. {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1962.840615] env[61439]: DEBUG nova.network.neutron [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] allocate_for_instance() {{(pid=61439) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1962.849200] env[61439]: DEBUG nova.compute.manager [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Start building block device mappings for instance. 
{{(pid=61439) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1962.902974] env[61439]: DEBUG nova.policy [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '07c678a871574174b2bfefac2f989bb9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '92ed23cb43494196a6a832149a981bb0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=61439) authorize /opt/stack/nova/nova/policy.py:203}} [ 1962.907743] env[61439]: DEBUG nova.compute.manager [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Start spawning the instance on the hypervisor. 
{{(pid=61439) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1962.930991] env[61439]: DEBUG nova.virt.hardware [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-20T17:03:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-20T17:02:47Z,direct_url=,disk_format='vmdk',id=a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d8f76046251a4a44a275999df0a57832',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-20T17:02:48Z,virtual_size=,visibility=), allow threads: False {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1962.931242] env[61439]: DEBUG nova.virt.hardware [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Flavor limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1962.931400] env[61439]: DEBUG nova.virt.hardware [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Image limits 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1962.931579] env[61439]: DEBUG nova.virt.hardware [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Flavor pref 0:0:0 {{(pid=61439) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1962.931723] env[61439]: DEBUG nova.virt.hardware [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Image pref 0:0:0 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1962.931868] env[61439]: DEBUG nova.virt.hardware [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=61439) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1962.932091] env[61439]: DEBUG nova.virt.hardware [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1962.932256] env[61439]: DEBUG nova.virt.hardware [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1962.932423] env[61439]: DEBUG nova.virt.hardware [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Got 1 possible topologies {{(pid=61439) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1962.932584] env[61439]: DEBUG nova.virt.hardware [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1962.932775] env[61439]: DEBUG nova.virt.hardware [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=61439) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1962.933646] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c44ef0f0-c6c6-4b4e-b187-769f62d0257c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1962.943036] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a03f61f4-bae9-4d79-a5e5-5c598f1b9e59 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1963.208391] env[61439]: DEBUG nova.network.neutron [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Successfully created port: b3796c9a-1913-4435-95dd-35b9a56256de {{(pid=61439) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1963.673540] env[61439]: DEBUG nova.compute.manager [req-0aa56dec-9d56-46a5-b494-89e0629c124f req-dd14b61a-3cc4-476b-be5d-afc2e7e56812 service nova] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Received event network-vif-plugged-b3796c9a-1913-4435-95dd-35b9a56256de {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1963.673765] env[61439]: DEBUG oslo_concurrency.lockutils [req-0aa56dec-9d56-46a5-b494-89e0629c124f req-dd14b61a-3cc4-476b-be5d-afc2e7e56812 service nova] Acquiring lock 
"9cc4b1c2-2582-4881-9572-0f261464eac3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1963.673992] env[61439]: DEBUG oslo_concurrency.lockutils [req-0aa56dec-9d56-46a5-b494-89e0629c124f req-dd14b61a-3cc4-476b-be5d-afc2e7e56812 service nova] Lock "9cc4b1c2-2582-4881-9572-0f261464eac3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1963.674178] env[61439]: DEBUG oslo_concurrency.lockutils [req-0aa56dec-9d56-46a5-b494-89e0629c124f req-dd14b61a-3cc4-476b-be5d-afc2e7e56812 service nova] Lock "9cc4b1c2-2582-4881-9572-0f261464eac3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1963.674342] env[61439]: DEBUG nova.compute.manager [req-0aa56dec-9d56-46a5-b494-89e0629c124f req-dd14b61a-3cc4-476b-be5d-afc2e7e56812 service nova] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] No waiting events found dispatching network-vif-plugged-b3796c9a-1913-4435-95dd-35b9a56256de {{(pid=61439) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1963.674506] env[61439]: WARNING nova.compute.manager [req-0aa56dec-9d56-46a5-b494-89e0629c124f req-dd14b61a-3cc4-476b-be5d-afc2e7e56812 service nova] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Received unexpected event network-vif-plugged-b3796c9a-1913-4435-95dd-35b9a56256de for instance with vm_state building and task_state spawning. 
[ 1963.750831] env[61439]: DEBUG nova.network.neutron [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Successfully updated port: b3796c9a-1913-4435-95dd-35b9a56256de {{(pid=61439) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1963.761697] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Acquiring lock "refresh_cache-9cc4b1c2-2582-4881-9572-0f261464eac3" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1963.761697] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Acquired lock "refresh_cache-9cc4b1c2-2582-4881-9572-0f261464eac3" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1963.761884] env[61439]: DEBUG nova.network.neutron [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Building network info cache for instance {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1963.802692] env[61439]: DEBUG nova.network.neutron [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Instance cache missing network info. 
{{(pid=61439) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1964.002290] env[61439]: DEBUG nova.network.neutron [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Updating instance_info_cache with network_info: [{"id": "b3796c9a-1913-4435-95dd-35b9a56256de", "address": "fa:16:3e:1d:8a:8b", "network": {"id": "668b80a6-37be-4f9b-8200-f3ef1f09c515", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-346638025-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "92ed23cb43494196a6a832149a981bb0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2bf99f85-3a5c-47c6-a603-e215be6ab0bd", "external-id": "nsx-vlan-transportzone-855", "segmentation_id": 855, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb3796c9a-19", "ovs_interfaceid": "b3796c9a-1913-4435-95dd-35b9a56256de", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1964.012340] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Releasing lock "refresh_cache-9cc4b1c2-2582-4881-9572-0f261464eac3" {{(pid=61439) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1964.012624] env[61439]: DEBUG nova.compute.manager [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Instance network_info: |[{"id": "b3796c9a-1913-4435-95dd-35b9a56256de", "address": "fa:16:3e:1d:8a:8b", "network": {"id": "668b80a6-37be-4f9b-8200-f3ef1f09c515", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-346638025-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "92ed23cb43494196a6a832149a981bb0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2bf99f85-3a5c-47c6-a603-e215be6ab0bd", "external-id": "nsx-vlan-transportzone-855", "segmentation_id": 855, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb3796c9a-19", "ovs_interfaceid": "b3796c9a-1913-4435-95dd-35b9a56256de", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=61439) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1964.013041] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:1d:8a:8b', 'network_ref': {'type': 'OpaqueNetwork', 
'network-id': '2bf99f85-3a5c-47c6-a603-e215be6ab0bd', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b3796c9a-1913-4435-95dd-35b9a56256de', 'vif_model': 'vmxnet3'}] {{(pid=61439) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1964.020443] env[61439]: DEBUG oslo.service.loopingcall [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=61439) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1964.020895] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Creating VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1964.021150] env[61439]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7e5edcde-7bd1-4238-8884-9e0f72fc2de6 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1964.042313] env[61439]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1964.042313] env[61439]: value = "task-987799" [ 1964.042313] env[61439]: _type = "Task" [ 1964.042313] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1964.049935] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987799, 'name': CreateVM_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1964.552347] env[61439]: DEBUG oslo_vmware.api [-] Task: {'id': task-987799, 'name': CreateVM_Task, 'duration_secs': 0.290384} completed successfully. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1964.552696] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Created VM on the ESX host {{(pid=61439) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1964.553206] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1964.553372] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1964.553684] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1964.553952] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d32c27d9-0600-496d-8ba7-c3f2df3620c7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1964.558216] env[61439]: DEBUG oslo_vmware.api [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 
tempest-AttachVolumeTestJSON-1078261084-project-member] Waiting for the task: (returnval){ [ 1964.558216] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]5228091f-654e-6ab7-3e4e-9f5601e392ca" [ 1964.558216] env[61439]: _type = "Task" [ 1964.558216] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1964.566382] env[61439]: DEBUG oslo_vmware.api [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]5228091f-654e-6ab7-3e4e-9f5601e392ca, 'name': SearchDatastore_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1965.068882] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1965.069154] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Processing image a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1965.069390] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1965.069538] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1965.069714] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1965.069959] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2dc0bd75-7d1a-4969-95bb-4a529c6004b0 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1965.077035] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1965.077215] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=61439) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1965.077869] env[61439]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-029621c3-f8e8-4cbb-a008-60d2b7504ea5 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1965.082344] env[61439]: DEBUG oslo_vmware.api [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Waiting for the task: (returnval){ [ 1965.082344] env[61439]: value = "session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52b20f75-9247-7af3-43c8-172acefb42b1" [ 1965.082344] env[61439]: _type = "Task" [ 1965.082344] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1965.092775] env[61439]: DEBUG oslo_vmware.api [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Task: {'id': session[52e2a42a-6bbc-6717-b0bf-8a924a68c128]52b20f75-9247-7af3-43c8-172acefb42b1, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1965.592952] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Preparing fetch location {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1965.593264] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Creating directory with path [datastore2] vmware_temp/d6449351-9fc2-4017-be3e-39eba24b2abe/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1965.593356] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-332d3854-1913-4ee3-93e3-fc057ef5486c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1965.611972] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Created directory with path [datastore2] vmware_temp/d6449351-9fc2-4017-be3e-39eba24b2abe/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 {{(pid=61439) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1965.612177] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Fetch image to [datastore2] vmware_temp/d6449351-9fc2-4017-be3e-39eba24b2abe/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk {{(pid=61439) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1965.612347] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to [datastore2] vmware_temp/d6449351-9fc2-4017-be3e-39eba24b2abe/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1965.613057] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efbae71c-e5dd-4cf8-9e73-7efb79833161 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1965.619417] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a8a3c89-e9de-4c31-bb53-36431dd5c97d {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1965.627959] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-341e62fd-7227-4053-ad0e-c0adcaaec202 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1965.657229] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d30edd3-bcc2-427d-9409-eb687c9cddc3 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1965.662348] env[61439]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-12240335-7acc-43b6-aebe-c8bf9d2f6bb9 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1965.683016] env[61439]: 
DEBUG nova.virt.vmwareapi.images [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Downloading image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1965.706615] env[61439]: DEBUG nova.compute.manager [req-abde6357-60df-4cad-9473-51c130af8724 req-7638b09e-0a06-4ddc-80cd-80f948b09262 service nova] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Received event network-changed-b3796c9a-1913-4435-95dd-35b9a56256de {{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1965.706815] env[61439]: DEBUG nova.compute.manager [req-abde6357-60df-4cad-9473-51c130af8724 req-7638b09e-0a06-4ddc-80cd-80f948b09262 service nova] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Refreshing instance network info cache due to event network-changed-b3796c9a-1913-4435-95dd-35b9a56256de. 
{{(pid=61439) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1965.707178] env[61439]: DEBUG oslo_concurrency.lockutils [req-abde6357-60df-4cad-9473-51c130af8724 req-7638b09e-0a06-4ddc-80cd-80f948b09262 service nova] Acquiring lock "refresh_cache-9cc4b1c2-2582-4881-9572-0f261464eac3" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1965.707371] env[61439]: DEBUG oslo_concurrency.lockutils [req-abde6357-60df-4cad-9473-51c130af8724 req-7638b09e-0a06-4ddc-80cd-80f948b09262 service nova] Acquired lock "refresh_cache-9cc4b1c2-2582-4881-9572-0f261464eac3" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1965.707541] env[61439]: DEBUG nova.network.neutron [req-abde6357-60df-4cad-9473-51c130af8724 req-7638b09e-0a06-4ddc-80cd-80f948b09262 service nova] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Refreshing network info cache for port b3796c9a-1913-4435-95dd-35b9a56256de {{(pid=61439) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1965.739335] env[61439]: DEBUG oslo_vmware.rw_handles [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d6449351-9fc2-4017-be3e-39eba24b2abe/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1965.798324] env[61439]: DEBUG oslo_vmware.rw_handles [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Completed reading data from the image iterator. 
{{(pid=61439) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1965.798510] env[61439]: DEBUG oslo_vmware.rw_handles [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d6449351-9fc2-4017-be3e-39eba24b2abe/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=61439) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1965.991017] env[61439]: DEBUG nova.network.neutron [req-abde6357-60df-4cad-9473-51c130af8724 req-7638b09e-0a06-4ddc-80cd-80f948b09262 service nova] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Updated VIF entry in instance network info cache for port b3796c9a-1913-4435-95dd-35b9a56256de. {{(pid=61439) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1965.991386] env[61439]: DEBUG nova.network.neutron [req-abde6357-60df-4cad-9473-51c130af8724 req-7638b09e-0a06-4ddc-80cd-80f948b09262 service nova] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Updating instance_info_cache with network_info: [{"id": "b3796c9a-1913-4435-95dd-35b9a56256de", "address": "fa:16:3e:1d:8a:8b", "network": {"id": "668b80a6-37be-4f9b-8200-f3ef1f09c515", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-346638025-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "92ed23cb43494196a6a832149a981bb0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", 
"port_filter": true, "nsx-logical-switch-id": "2bf99f85-3a5c-47c6-a603-e215be6ab0bd", "external-id": "nsx-vlan-transportzone-855", "segmentation_id": 855, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb3796c9a-19", "ovs_interfaceid": "b3796c9a-1913-4435-95dd-35b9a56256de", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1966.000458] env[61439]: DEBUG oslo_concurrency.lockutils [req-abde6357-60df-4cad-9473-51c130af8724 req-7638b09e-0a06-4ddc-80cd-80f948b09262 service nova] Releasing lock "refresh_cache-9cc4b1c2-2582-4881-9572-0f261464eac3" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1986.202221] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager.update_available_resource {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1986.215164] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1986.215401] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1986.215569] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" 
"released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1986.215725] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=61439) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1986.216862] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d49f078d-1c30-4b55-952c-19690c7c4deb {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1986.226171] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32b25587-9af5-4a6f-ae72-16919ecf3381 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1986.239840] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f16d0d5-1e52-4adb-88be-35650ee22f56 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1986.246155] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47a4d778-b901-4cf4-ae56-e0c3327ec026 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1986.274815] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181585MB free_disk=35GB free_vcpus=48 pci_devices=None {{(pid=61439) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 
1986.274978] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1986.275172] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1986.317602] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Instance 9cc4b1c2-2582-4881-9572-0f261464eac3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=61439) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1986.317813] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1986.317965] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=61439) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1986.344811] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-404516f9-c153-4d65-811c-e3fad87c6b0f {{(pid=61439) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1986.352216] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5ffada0-cb1a-4b68-89ae-9b968e864277 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1986.382240] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f445b4ee-958c-4bb0-a55e-990c91aa14c4 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1986.389282] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6776caef-f517-4bf9-adef-a7075db75f5c {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1986.401892] env[61439]: DEBUG nova.compute.provider_tree [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1986.409486] env[61439]: DEBUG nova.scheduler.client.report [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1986.421462] env[61439]: DEBUG nova.compute.resource_tracker [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Compute_service 
record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=61439) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1986.421638] env[61439]: DEBUG oslo_concurrency.lockutils [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.146s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1987.421995] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1988.203088] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1990.203016] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1990.203433] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=61439) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1992.198058] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1992.201690] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1992.201858] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Starting heal instance info cache {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1992.202023] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Rebuilding the list of instances to heal {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1992.212424] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Skipping network cache update for instance because it is Building. {{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1992.212575] env[61439]: DEBUG nova.compute.manager [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Didn't find any instances for network info cache update. 
{{(pid=61439) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1992.212794] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1992.213322] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1994.202061] env[61439]: DEBUG oslo_service.periodic_task [None req-b7e61c50-8186-4e48-8662-9666f38a733b None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=61439) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2013.116026] env[61439]: WARNING oslo_vmware.rw_handles [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2013.116026] env[61439]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2013.116026] env[61439]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2013.116026] env[61439]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2013.116026] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2013.116026] env[61439]: ERROR oslo_vmware.rw_handles response.begin() [ 2013.116026] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2013.116026] env[61439]: ERROR 
oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2013.116026] env[61439]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2013.116026] env[61439]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2013.116026] env[61439]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2013.116026] env[61439]: ERROR oslo_vmware.rw_handles [ 2013.116825] env[61439]: DEBUG nova.virt.vmwareapi.images [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Downloaded image file data a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22 to vmware_temp/d6449351-9fc2-4017-be3e-39eba24b2abe/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk on the data store datastore2 {{(pid=61439) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2013.118507] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Caching image {{(pid=61439) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2013.118754] env[61439]: DEBUG nova.virt.vmwareapi.vm_util [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Copying Virtual Disk [datastore2] vmware_temp/d6449351-9fc2-4017-be3e-39eba24b2abe/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/tmp-sparse.vmdk to [datastore2] vmware_temp/d6449351-9fc2-4017-be3e-39eba24b2abe/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk {{(pid=61439) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2013.119067] env[61439]: DEBUG 
oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ce57175e-1f0d-47e7-9bb4-8c0f77dd3c27 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2013.127368] env[61439]: DEBUG oslo_vmware.api [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Waiting for the task: (returnval){ [ 2013.127368] env[61439]: value = "task-987800" [ 2013.127368] env[61439]: _type = "Task" [ 2013.127368] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2013.134841] env[61439]: DEBUG oslo_vmware.api [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Task: {'id': task-987800, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2013.637273] env[61439]: DEBUG oslo_vmware.exceptions [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Fault InvalidArgument not matched. 
{{(pid=61439) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2013.637557] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22/a8b5e8e9-a03f-44eb-a8f4-56b1df9fce22.vmdk" {{(pid=61439) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2013.638118] env[61439]: ERROR nova.compute.manager [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2013.638118] env[61439]: Faults: ['InvalidArgument'] [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Traceback (most recent call last): [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] yield resources [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] self.driver.spawn(context, instance, image_meta, [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 
9cc4b1c2-2582-4881-9572-0f261464eac3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] self._fetch_image_if_missing(context, vi) [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] image_cache(vi, tmp_image_ds_loc) [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] vm_util.copy_virtual_disk( [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] session._wait_for_task(vmdk_copy_task) [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] return self.wait_for_task(task_ref) [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2013.638118] env[61439]: ERROR nova.compute.manager 
[instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] return evt.wait() [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] result = hub.switch() [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] return self.greenlet.switch() [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] self.f(*self.args, **self.kw) [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] raise exceptions.translate_fault(task_info.error) [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Faults: ['InvalidArgument'] [ 2013.638118] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] [ 2013.639190] env[61439]: INFO nova.compute.manager [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 
tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Terminating instance [ 2013.641519] env[61439]: DEBUG nova.compute.manager [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Start destroying the instance on the hypervisor. {{(pid=61439) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2013.641710] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Destroying instance {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2013.642439] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b6bbb6d-3a4a-4f29-bda3-8cc574cd7179 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2013.650084] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Unregistering the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2013.650298] env[61439]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f3da73c6-6e27-497a-ba8b-8c8cd67d0d2f {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2013.714300] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] 
Unregistered the VM {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2013.714560] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Deleting contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2013.714704] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Deleting the datastore file [datastore2] 9cc4b1c2-2582-4881-9572-0f261464eac3 {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2013.714975] env[61439]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b7ed6fd3-daf7-47aa-ac18-87316b1068f7 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2013.721985] env[61439]: DEBUG oslo_vmware.api [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Waiting for the task: (returnval){ [ 2013.721985] env[61439]: value = "task-987802" [ 2013.721985] env[61439]: _type = "Task" [ 2013.721985] env[61439]: } to complete. {{(pid=61439) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2013.729176] env[61439]: DEBUG oslo_vmware.api [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Task: {'id': task-987802, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2014.231484] env[61439]: DEBUG oslo_vmware.api [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Task: {'id': task-987802, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.064645} completed successfully. {{(pid=61439) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2014.231931] env[61439]: DEBUG nova.virt.vmwareapi.ds_util [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Deleted the datastore file {{(pid=61439) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2014.231931] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Deleted contents of the VM from datastore datastore2 {{(pid=61439) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2014.232083] env[61439]: DEBUG nova.virt.vmwareapi.vmops [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Instance destroyed {{(pid=61439) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2014.232191] env[61439]: INFO nova.compute.manager [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Took 0.59 seconds to destroy the instance on the hypervisor. 
[ 2014.234419] env[61439]: DEBUG nova.compute.claims [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Aborting claim: {{(pid=61439) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2014.234595] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2014.234821] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2014.304977] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f59271f5-e0c1-4332-86fa-d36bdf09f5af {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2014.312456] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20d127cd-c651-4b97-9dec-fa66d32b5628 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2014.341486] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58447667-429a-45c9-b74d-74a5dd514c42 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
2014.349493] env[61439]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68113709-6b52-4ec7-b2d9-4f5222700c70 {{(pid=61439) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2014.362867] env[61439]: DEBUG nova.compute.provider_tree [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Inventory has not changed in ProviderTree for provider: b35c9fce-988b-4acc-b175-83b202107c41 {{(pid=61439) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2014.371248] env[61439]: DEBUG nova.scheduler.client.report [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Inventory has not changed for provider b35c9fce-988b-4acc-b175-83b202107c41 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 35, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=61439) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2014.385469] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.151s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2014.385985] env[61439]: ERROR nova.compute.manager [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 
tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2014.385985] env[61439]: Faults: ['InvalidArgument']
[ 2014.385985] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Traceback (most recent call last):
[ 2014.385985] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2014.385985] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3]     self.driver.spawn(context, instance, image_meta,
[ 2014.385985] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2014.385985] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2014.385985] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2014.385985] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3]     self._fetch_image_if_missing(context, vi)
[ 2014.385985] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2014.385985] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3]     image_cache(vi, tmp_image_ds_loc)
[ 2014.385985] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2014.385985] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3]     vm_util.copy_virtual_disk(
[ 2014.385985] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2014.385985] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3]     session._wait_for_task(vmdk_copy_task)
[ 2014.385985] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2014.385985] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3]     return self.wait_for_task(task_ref)
[ 2014.385985] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2014.385985] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3]     return evt.wait()
[ 2014.385985] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2014.385985] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3]     result = hub.switch()
[ 2014.385985] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2014.385985] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3]     return self.greenlet.switch()
[ 2014.385985] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2014.385985] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3]     self.f(*self.args, **self.kw)
[ 2014.385985] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2014.385985] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3]     raise exceptions.translate_fault(task_info.error)
[ 2014.385985] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2014.385985] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Faults: ['InvalidArgument']
[ 2014.385985] env[61439]: ERROR nova.compute.manager [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3]
[ 2014.386838] env[61439]: DEBUG nova.compute.utils [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] VimFaultException {{(pid=61439) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 2014.388398] env[61439]: DEBUG nova.compute.manager [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Build of instance 9cc4b1c2-2582-4881-9572-0f261464eac3 was re-scheduled: A specified parameter was not correct: fileType
[ 2014.388398] env[61439]: Faults: ['InvalidArgument'] {{(pid=61439) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 2014.388782] env[61439]: DEBUG nova.compute.manager [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Unplugging VIFs for instance {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 2014.388954] env[61439]: DEBUG nova.compute.manager [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=61439) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 2014.389144] env[61439]: DEBUG nova.compute.manager [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Deallocating network for instance {{(pid=61439) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 2014.389310] env[61439]: DEBUG nova.network.neutron [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] deallocate_for_instance() {{(pid=61439) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2014.781685] env[61439]: DEBUG nova.network.neutron [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Updating instance_info_cache with network_info: [] {{(pid=61439) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2014.793939] env[61439]: INFO nova.compute.manager [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] [instance: 9cc4b1c2-2582-4881-9572-0f261464eac3] Took 0.40 seconds to deallocate network for instance.
[ 2014.901616] env[61439]: INFO nova.scheduler.client.report [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Deleted allocations for instance 9cc4b1c2-2582-4881-9572-0f261464eac3
[ 2014.922044] env[61439]: DEBUG oslo_concurrency.lockutils [None req-7776e6cb-ec13-4a4a-8313-5c860fc2b822 tempest-AttachVolumeTestJSON-1078261084 tempest-AttachVolumeTestJSON-1078261084-project-member] Lock "9cc4b1c2-2582-4881-9572-0f261464eac3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 52.975s {{(pid=61439) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}